differential entropy

(2.1 hours to learn)


Differential entropy is a generalization of entropy to continuous random variables. It is closely related to the asymptotic entropy of increasingly fine discretizations of the continuous distribution. KL divergence and mutual information can similarly be extended to the continuous case.
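The relationship to discretization can be checked numerically. A minimal sketch (working in nats, and assuming NumPy and SciPy are available): if a continuous random variable X is binned into cells of width Δ, the entropy of the discretized variable behaves like h(X) − log Δ as Δ → 0, so the discrete entropy diverges while the difference converges to the differential entropy.

```python
import numpy as np
from scipy.stats import norm

# Differential entropy of N(0, sigma^2) in nats: h = 0.5 * log(2*pi*e*sigma^2)
sigma = 1.0
h_true = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

for delta in [0.5, 0.1, 0.01]:
    # Discretize the Gaussian into bins of width delta
    edges = np.arange(-10.0, 10.0 + delta, delta)
    p = np.diff(norm.cdf(edges, scale=sigma))  # probability mass per bin
    p = p[p > 0]                               # drop empty bins before taking logs
    H_discrete = -np.sum(p * np.log(p))        # entropy of the discretized variable
    # H_discrete ~ h(X) - log(delta); the gap below shrinks as delta -> 0
    print(delta, H_discrete + np.log(delta) - h_true)
```

Note that H_discrete itself grows without bound as Δ shrinks; only the combination H_discrete + log Δ stabilizes, which is why differential entropy is not itself a limit of discrete entropies.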


This concept has the following learning goals:


  • Know the definition of differential entropy
  • Compute it for simple examples, such as the uniform distribution and the Gaussian distribution
  • Be aware that differential entropy can be negative
  • Understand how differential entropy relates to the asymptotic entropy of discretizations of the distribution
  • Extend the definitions of mutual information and KL divergence to the continuous case
    • Note: unlike differential entropy, the continuous versions of mutual information and KL divergence behave like their discrete counterparts; in this sense, they are more fundamental.
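For the simple examples above, differential entropy has closed forms: h(Uniform(a, b)) = log(b − a) and h(N(μ, σ²)) = ½ log(2πeσ²), both in nats. A short sketch of these formulas (the function names are illustrative, not from any particular library) also shows how negativity arises:

```python
import numpy as np

def h_uniform(a, b):
    # h(Uniform(a, b)) = log(b - a); negative whenever the support is shorter than 1
    return np.log(b - a)

def h_gaussian(sigma):
    # h(N(mu, sigma^2)) = 0.5 * log(2*pi*e*sigma^2); independent of the mean mu
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print(h_uniform(0.0, 2.0))   # log 2 > 0
print(h_uniform(0.0, 0.5))   # log(1/2) < 0: differential entropy can be negative
print(h_gaussian(1.0))       # 0.5 * log(2*pi*e), about 1.419 nats
```

A negative value is not a contradiction: unlike discrete entropy, differential entropy is not a count of bits/nats of uncertainty, and rescaling X shifts it by a constant.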
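The continuous KL divergence, D(p‖q) = E_p[log p(X) − log q(X)], really does behave like its discrete counterpart: it is nonnegative and invariant to how finely one discretizes. A sketch (assuming NumPy; the closed-form expression is the standard Gaussian-to-Gaussian KL) comparing a Monte Carlo estimate against the exact value for two Gaussians:

```python
import numpy as np

rng = np.random.default_rng(0)

# KL(N(mu1, s1^2) || N(mu2, s2^2)) in nats, closed form:
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Monte Carlo estimate of E_p[log p(X) - log q(X)] with X ~ p = N(mu1, s1^2)
x = rng.normal(mu1, s1, size=200_000)
log_p = -0.5 * np.log(2 * np.pi * s1**2) - (x - mu1)**2 / (2 * s1**2)
log_q = -0.5 * np.log(2 * np.pi * s2**2) - (x - mu2)**2 / (2 * s2**2)
kl_mc = np.mean(log_p - log_q)

print(kl_closed, kl_mc)  # both positive, and close to each other
```

This is the sense in which KL divergence (and mutual information, which is a KL divergence) is more fundamental than differential entropy: no log Δ correction appears in the continuous limit.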

Core resources (read/watch one of the following)


See also
