
Entropy estimation

In various science and engineering applications, such as independent component analysis, image analysis, genetic analysis, speech recognition, manifold learning, evaluation of the status of biological systems, and time delay estimation, it is useful to estimate the differential entropy of a system or process, given some observations. The simplest and most common approach uses histogram-based estimation, but other approaches have been developed and used, each with its own benefits and drawbacks. The main factor in choosing a method is often a trade-off between the bias and the variance of the estimate, although the nature of the (suspected) distribution of the data may also be a factor. A simple approach evaluates the probability distribution f(x) of a biological variable with the entropy normalized by its maximum value, H_max = log n; this normalized entropy has demonstrated advantages over standard physiological indices in estimating the functional status of the cardiovascular, nervous, and immune systems.
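As a concrete illustration of the two ideas above, here is a minimal sketch, not taken from the article itself, of (a) a histogram-based estimate of differential entropy and (b) discrete entropy normalized by its maximum value H_max = log n. The function names, bin count, and sample size are illustrative assumptions.

```python
import numpy as np

def histogram_entropy(samples, bins=30):
    """Histogram-based estimate of differential entropy (in nats).

    Approximates the density in each bin as p_i / w_i (probability mass
    over bin width), so H ~= -sum_i p_i * log(p_i / w_i).
    """
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)              # bin widths w_i
    p = counts / counts.sum()            # probability mass per bin
    nz = p > 0                           # skip empty bins (0 * log 0 := 0)
    return -np.sum(p[nz] * np.log(p[nz] / widths[nz]))

def normalized_entropy(p):
    """Discrete entropy divided by its maximum H_max = log n, giving a
    value in [0, 1]; 1 corresponds to a uniform distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    nz = p > 0
    h = -np.sum(p[nz] * np.log(p[nz]))
    return h / np.log(p.size)

# Usage sketch: a standard normal has differential entropy
# 0.5 * log(2 * pi * e) ~= 1.419 nats, so the histogram estimate
# should land near that value for a large sample.
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
est = histogram_entropy(x)
```

The bias/variance trade-off mentioned above shows up directly in the `bins` parameter: few wide bins smooth the density estimate (higher bias, lower variance), while many narrow bins do the opposite.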

[ "Principle of maximum entropy", "Estimator", "Entropy (information theory)" ]
Parent Topic
Child Topic
    No Parent Topic