
Differential entropy

Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions. Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP). Differential entropy (described here) is commonly encountered in the literature, but it is a limiting case of the LDDP, and one that loses its fundamental association with discrete entropy.

Let $X$ be a random variable with a probability density function $f$ whose support is a set $\mathcal{X}$. The differential entropy $h(X)$ or $h(f)$ is defined as

$$h(X) = -\int_{\mathcal{X}} f(x) \log f(x) \, dx.$$

For probability distributions which do not have an explicit density function expression, but do have an explicit quantile function expression $Q(p)$, $h(Q)$ can be defined in terms of the derivative of $Q(p)$, i.e. the quantile density function $Q'(p)$, as

$$h(Q) = \int_0^1 \log Q'(p) \, dp.$$

As with its discrete analog, the units of differential entropy depend on the base of the logarithm, which is usually 2 (i.e., the units are bits). See logarithmic units for logarithms taken in different bases. Related concepts such as joint differential entropy, conditional differential entropy, and relative entropy are defined in a similar fashion. Unlike the discrete analog, the differential entropy has an offset that depends on the units used to measure $X$. For example, the differential entropy of a quantity measured in millimeters will be log(1000) more than the same quantity measured in meters; a dimensionless quantity will have differential entropy of log(1000) more than the same quantity divided by 1000.

One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, the uniform distribution $\mathcal{U}(0, 1/2)$ has negative differential entropy: its density is $f(x) = 2$ on $(0, 1/2)$, so $h(X) = -\int_0^{1/2} 2 \log 2 \, dx = -\log 2 < 0$.
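The two quantitative points above, that differential entropy can be negative and that it shifts by log(1000) under a change of units, can be checked numerically. The sketch below is not from the source article; it assumes NumPy/SciPy are available, and the helper name `diff_entropy_nats` is a hypothetical label chosen here, not an established API. It works in natural logarithms (nats) rather than bits.

```python
# Minimal sketch: numerically verify h(U(0, 1/2)) = -log 2 and the log(1000)
# unit-change offset. `diff_entropy_nats` is a hypothetical helper name.
import numpy as np
from scipy import integrate, stats

def diff_entropy_nats(pdf, a, b):
    """Evaluate h(X) = -integral of f(x) log f(x) dx over the support [a, b], in nats."""
    def integrand(x):
        fx = pdf(x)
        return -fx * np.log(fx) if fx > 0 else 0.0
    value, _ = integrate.quad(integrand, a, b)
    return value

# 1. U(0, 1/2) has negative differential entropy: f(x) = 2 on (0, 1/2),
#    so h = -log 2 ≈ -0.693 nats.
u_m = stats.uniform(loc=0.0, scale=0.5)
print(diff_entropy_nats(u_m.pdf, 0.0, 0.5))  # ≈ -0.6931
print(u_m.entropy())                         # SciPy's closed form agrees

# 2. Unit dependence: expressing the same quantity in millimeters instead of
#    meters rescales X by 1000 and shifts h by log(1000) ≈ 6.9078 nats.
u_mm = stats.uniform(loc=0.0, scale=500.0)
print(u_mm.entropy() - u_m.entropy())        # ≈ 6.9078 = log(1000)
```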

[ "Joint quantum entropy", "Binary entropy function", "Maximum entropy thermodynamics", "Maximum entropy probability distribution", "Joint entropy", "Limiting density of discrete points" ]