differential entropy
(2.1 hours to learn)
Summary
Differential entropy is a generalization of entropy to continuous random variables. It is closely related to the asymptotic entropy of increasingly fine discretizations of the continuous distribution. KL divergence and mutual information can similarly be extended to the continuous case.
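For reference, the standard definition and its connection to discretized entropy (following the conventions used in Cover and Thomas) can be stated as follows, where f is the density of X and X^Δ denotes X quantized into bins of width Δ:

```latex
% Differential entropy of a continuous random variable X with density f
% (in nats when the logarithm is natural):
h(X) = -\int f(x) \log f(x) \, dx
% If X^{\Delta} denotes X quantized into bins of width \Delta, then
% (assuming f(x) \log f(x) is Riemann integrable)
H\!\bigl(X^{\Delta}\bigr) + \log \Delta \;\longrightarrow\; h(X)
\quad \text{as } \Delta \to 0 .
```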
Context
This concept has the prerequisites:
- entropy
- mutual information
- KL divergence
- expectation and variance (Differential entropy is defined as an expectation)
- Gaussian distribution (Computing the differential entropy of the Gaussian distribution is an instructive example.)
Goals
- Know the definition of differential entropy
- Compute it for some simple examples, such as the uniform distribution and the Gaussian distribution (a numerical sketch follows this list)
- Be aware that differential entropy can be negative
- Understand how differential entropy relates to the asymptotic entropy of discretizations of the distribution
- Extend the definitions of mutual information and KL divergence to the continuous case.
- Note: unlike differential entropy, the continuous versions of mutual information and KL divergence retain the key properties of their discrete counterparts (in particular, both remain nonnegative). Therefore, in a sense, they are more fundamental.
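To make these goals concrete, here is a minimal numerical sketch (not drawn from the textbook) using NumPy and SciPy; the helper `discretized_entropy`, the parameter choices, and the integration ranges are illustrative assumptions, and all values are in nats.

```python
# Minimal sketch: closed-form differential entropies for the uniform and
# Gaussian distributions, a check that the entropy of a fine discretization
# behaves like h(X) - log(bin width), and a continuous KL divergence.
import numpy as np
from scipy import stats

def discretized_entropy(pdf, lo, hi, width):
    """Shannon entropy (nats) of the distribution obtained by binning the
    density `pdf` into bins of the given width over [lo, hi]."""
    edges = np.arange(lo, hi + width, width)
    centers = (edges[:-1] + edges[1:]) / 2
    probs = pdf(centers) * width      # midpoint approximation of each bin's mass
    probs = probs[probs > 0]
    probs = probs / probs.sum()       # renormalize away truncated tail mass
    return -np.sum(probs * np.log(probs))

# Uniform(0, a): h(X) = log a, which is negative whenever a < 1.
a = 0.5
h_uniform = np.log(a)                                   # ~ -0.693 nats

# Gaussian with variance sigma^2: h(X) = 0.5 * log(2 * pi * e * sigma^2).
sigma = 2.0
h_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)   # ~ 2.112 nats

# The entropy of a discretization with bin width Delta satisfies
# H(X^Delta) + log(Delta) -> h(X) as Delta -> 0.
width = 1e-3
print(h_uniform,
      discretized_entropy(stats.uniform(0, a).pdf, 0, a, width) + np.log(width))
print(h_gauss,
      discretized_entropy(stats.norm(0, sigma).pdf, -8 * sigma, 8 * sigma, width) + np.log(width))

# Continuous KL divergence extends the discrete definition directly and stays
# nonnegative; in closed form, KL(N(0,1) || N(1,1)) = 0.5 nats.
p, q = stats.norm(0, 1), stats.norm(1, 1)
xs = np.linspace(-10, 10, 200001)
dx = xs[1] - xs[0]
kl = np.sum(p.pdf(xs) * (p.logpdf(xs) - q.logpdf(xs))) * dx
print(kl)                                               # ~ 0.5
```

In each printed pair the closed-form and discretized values should closely agree, and the uniform example is negative, illustrating how differential entropy differs from discrete entropy.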
Core resources (read/watch one of the following)
-Paid-
→ Elements of Information Theory
A graduate-level textbook on information theory.
- Section 8.1, "Definitions," pages 243-244
- Section 8.3, "Relation of differential entropy to discrete entropy," pages 247-249
- Section 8.4, "Joint and conditional differential entropy," pages 249-250
- Section 8.5, "Relative entropy and mutual information," pages 250-252
See also
-No Additional Notes-