Entropy power inequality

In information theory, the entropy power inequality is a result concerning the so-called 'entropy power' of random variables. It shows that the entropy power of suitably well-behaved independent random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper 'A Mathematical Theory of Communication'. Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.

For a random variable X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be

h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, dx

and the entropy power of X, denoted N(X), is defined to be

N(X) = \frac{1}{2\pi e} \, e^{\frac{2}{n} h(X)}.

In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K.

Let X and Y be independent random variables with probability density functions in the L^p space L^p(R^n) for some p > 1. Then

N(X + Y) \geq N(X) + N(Y).

Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
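A quick way to see the statement concretely is the Gaussian case, where the entropy power reduces to N(X) = |K|^{1/n} and the inequality becomes Minkowski's determinant inequality |K_X + K_Y|^{1/n} ≥ |K_X|^{1/n} + |K_Y|^{1/n}. The sketch below is not from the source; it is a minimal numerical check assuming NumPy, with arbitrarily chosen illustrative covariance matrices.

```python
import numpy as np

def entropy_power_gaussian(K: np.ndarray) -> float:
    """Entropy power N(X) = |K|^(1/n) of a Gaussian vector with covariance K."""
    n = K.shape[0]
    return np.linalg.det(K) ** (1.0 / n)

n = 3
rng = np.random.default_rng(0)

# Build two random positive-definite covariance matrices (illustrative choice).
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
K_X = A @ A.T + np.eye(n)
K_Y = B @ B.T + np.eye(n)

# X, Y independent Gaussians => X + Y is Gaussian with covariance K_X + K_Y.
lhs = entropy_power_gaussian(K_X + K_Y)                          # N(X + Y)
rhs = entropy_power_gaussian(K_X) + entropy_power_gaussian(K_Y)  # N(X) + N(Y)
print(f"N(X+Y) = {lhs:.4f} >= N(X) + N(Y) = {rhs:.4f}: {lhs >= rhs}")

# Equality case: proportional covariances, e.g. K_Y = 2 * K_X.
K_Y_prop = 2.0 * K_X
lhs_eq = entropy_power_gaussian(K_X + K_Y_prop)
rhs_eq = entropy_power_gaussian(K_X) + entropy_power_gaussian(K_Y_prop)
print(f"Proportional case: N(X+Y) = {lhs_eq:.4f}, N(X)+N(Y) = {rhs_eq:.4f}")
```

In the proportional case the two printed values coincide, matching the equality condition stated above.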

[ "Maximum entropy thermodynamics", "Entropy rate", "Transfer entropy" ]