Characteristic function (probability theory)

In probability theory and statistics, the characteristic function of any real-valued random variable completely defines its probability distribution. If a random variable admits a probability density function, then the characteristic function is the Fourier transform of the probability density function. It thus provides the basis for an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the characteristic functions of distributions defined by weighted sums of random variables.

In addition to univariate distributions, characteristic functions can be defined for vector- or matrix-valued random variables, and can also be extended to more generic cases. The characteristic function always exists when treated as a function of a real-valued argument, unlike the moment-generating function. There are relations between the behavior of the characteristic function of a distribution and properties of the distribution, such as the existence of moments and the existence of a density function.

The characteristic function provides an alternative way of describing a random variable. Similar to the cumulative distribution function F_X(x) = E[1_{X ≤ x}] (where 1_{X ≤ x} is the indicator function, equal to 1 when X ≤ x and zero otherwise), which completely determines the behavior and properties of the probability distribution of the random variable X, the characteristic function φ_X(t) = E[e^{itX}] also completely determines the behavior and properties of the probability distribution of X. The two approaches are equivalent in the sense that, knowing one of the functions, it is always possible to find the other, yet they provide different insights into the features of the random variable. However, in particular cases there can be differences in whether these functions can be represented as expressions involving simple standard functions.

If a random variable admits a density function, then the characteristic function is its dual, in the sense that each of them is a Fourier transform of the other. If a random variable has a moment-generating function M_X(t), then the domain of the characteristic function can be extended to the complex plane, and φ_X(-it) = M_X(t). Note, however, that the characteristic function of a distribution always exists, even when the probability density function or the moment-generating function does not.

The characteristic function approach is particularly useful in the analysis of linear combinations of independent random variables: a classical proof of the central limit theorem uses characteristic functions and Lévy's continuity theorem. Another important application is to the theory of the decomposability of random variables.
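As a concrete illustration of the definition φ_X(t) = E[e^{itX}], here is a minimal sketch (not part of the article; it assumes Python with NumPy is available) that estimates the characteristic function of a standard normal variable by averaging e^{itX} over a Monte Carlo sample and compares the estimate with the known closed form e^{-t^2/2}.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)        # sample of X ~ N(0, 1)

def empirical_cf(sample, t):
    """Monte Carlo estimate of phi_X(t) = E[exp(i*t*X)] from a sample of X."""
    t = np.atleast_1d(t)
    # For each t, average exp(i*t*X_k) over the sample.
    return np.exp(1j * np.outer(t, sample)).mean(axis=1)

t = np.linspace(-3.0, 3.0, 7)
estimate = empirical_cf(x, t)
exact = np.exp(-t**2 / 2)               # closed-form CF of the standard normal
print(np.max(np.abs(estimate - exact))) # small, and shrinks as the sample grows
```

Because |e^{itX}| = 1, the average exists for every real t regardless of the distribution, mirroring the fact stated above that the characteristic function is always defined for real arguments.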
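The simple behavior under sums of independent random variables mentioned above can also be checked numerically: if X and Y are independent, then φ_{X+Y}(t) = φ_X(t) φ_Y(t). The sketch below (again an illustrative assumption in Python with NumPy, not material from the article) compares the empirical characteristic function of X + Y with the product of the closed-form characteristic functions of a standard normal and a unit-rate exponential.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x = rng.standard_normal(n)              # X ~ N(0, 1)
y = rng.exponential(1.0, n)             # Y ~ Exp(1), independent of X

t = np.linspace(-2.0, 2.0, 9)
# Empirical CF of the sum X + Y.
cf_sum = np.exp(1j * np.outer(t, x + y)).mean(axis=1)
# Product of the known CFs: exp(-t^2/2) for N(0, 1) and 1/(1 - i*t) for Exp(1).
cf_product = np.exp(-t**2 / 2) / (1.0 - 1j * t)
print(np.max(np.abs(cf_sum - cf_product)))   # agreement up to Monte Carlo error
```

It is this multiplicative behavior, combined with Lévy's continuity theorem, that makes characteristic functions convenient in classical proofs of the central limit theorem.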

[ "Applied mathematics", "Statistics", "Mathematical analysis", "Probability density function", "Empirical characteristic function", "Wrapped distribution", "Holtsmark distribution" ]