(2.1 hours to learn)
Differential entropy is a generalization of entropy to continuous random variables. It is closely related to the asymptotic entropy of increasingly fine discretizations of the continuous distribution. KL divergence and mutual information can similarly be extended to the continuous case.
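For reference, the standard continuous-case definitions (standard notation; the symbols here are not drawn from any one of the resources below) for densities f and g:

```latex
\begin{align*}
  h(X)          &= -\int f(x) \log f(x) \, dx
                 && \text{differential entropy} \\
  D(f \,\|\, g) &= \int f(x) \log \frac{f(x)}{g(x)} \, dx
                 && \text{KL divergence} \\
  I(X; Y)       &= \iint f(x,y) \log \frac{f(x,y)}{f(x)\,f(y)} \, dx \, dy
                 && \text{mutual information}
\end{align*}
```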
This concept has the following goals:
- Know the definition of differential entropy
- Compute it for some simple examples, such as the uniform and Gaussian distributions (see the sketch after this list)
- Be aware that differential entropy can be negative
- Understand how differential entropy relates to the asymptotic entropy of discretizations of the distribution
- Extend the definitions of mutual information and KL divergence to the continuous case
- Note: unlike differential entropy, the continuous versions of mutual information and KL divergence behave like their discrete counterparts. Therefore, in a sense, they are more fundamental.
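The following is a minimal sketch of the computational goals above (the function names are mine, for illustration only): the closed-form entropies of the uniform and Gaussian distributions, the fact that the former is negative for interval widths below 1, and the discretization relation H(X^Δ) + log Δ → h(X) covered in Section 8.3 below.

```python
import numpy as np
from scipy.stats import norm

def h_uniform(a):
    """Differential entropy of Uniform(0, a): log(a), negative when a < 1."""
    return np.log(a)

def h_gaussian(sigma):
    """Differential entropy of N(mu, sigma^2): 0.5 * log(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def discretized_entropy(pdf, lo, hi, delta):
    """Discrete entropy of the quantized variable X^Delta: bin the density
    into cells of width delta and compute -sum p_i log p_i."""
    edges = np.arange(lo, hi + delta, delta)
    centers = (edges[:-1] + edges[1:]) / 2
    p = pdf(centers) * delta      # cell probabilities (midpoint rule)
    p = p[p > 0]                  # drop empty cells
    p = p / p.sum()               # absorb truncation error from [lo, hi]
    return -np.sum(p * np.log(p))

print(h_uniform(0.5))             # log(1/2) < 0: differential entropy can be negative

# H(X^Delta) + log(Delta) approaches h(X) as Delta -> 0
for delta in (0.5, 0.1, 0.01):
    H = discretized_entropy(norm(0, 1).pdf, -10.0, 10.0, delta)
    print(f"delta={delta}: H + log(delta) = {H + np.log(delta):.4f}, "
          f"h = {h_gaussian(1.0):.4f}")
```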
Core resources (read/watch one of the following)
→ Elements of Information Theory
A graduate-level textbook on information theory.
- Section 8.1, "Definitions," pages 243-244
- Section 8.3, "Relation of differential entropy to discrete entropy," pages 247-249
- Section 8.4, "Joint and conditional differential entropy," pages 249-250
- Section 8.5, "Relative entropy and mutual information," pages 250-252