Taylor expansions for the moments of functions of random variables

In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.

First moment

Expanding f in a second-order Taylor series about the mean \mu_X gives

E[f(X)] \approx E\left[ f(\mu_X) + f'(\mu_X)(X - \mu_X) + \tfrac{1}{2} f''(\mu_X)(X - \mu_X)^2 \right].

Since E[X - \mu_X] = 0, the second term disappears. Also, E[(X - \mu_X)^2] is \sigma_X^2. Therefore,

E[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2} \sigma_X^2,

where \mu_X and \sigma_X^2 are the mean and variance of X respectively.

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example, for the ratio of two random variables,

E\left[ \frac{X}{Y} \right] \approx \frac{E[X]}{E[Y]} - \frac{\operatorname{Cov}[X, Y]}{E[Y]^2} + \frac{E[X]}{E[Y]^3} \operatorname{Var}[Y].

Second moment

Similarly,

\operatorname{Var}[f(X)] \approx \left( f'(E[X]) \right)^2 \operatorname{Var}[X] = \left( f'(\mu_X) \right)^2 \sigma_X^2.

Unlike the second-order expansion used for the first moment, this uses only a first-order approximation, so it will be a poor approximation where f(X) is highly non-linear. It is a special case of the delta method. For example,

\operatorname{Var}\left[ \frac{X}{Y} \right] \approx \frac{\operatorname{Var}[X]}{E[Y]^2} - \frac{2 E[X]}{E[Y]^3} \operatorname{Cov}[X, Y] + \frac{E[X]^2}{E[Y]^4} \operatorname{Var}[Y].

The second-order approximation, assuming the third central moment of X vanishes (as it does for a normal X), is

\operatorname{Var}[f(X)] \approx \left( f'(\mu_X) \right)^2 \sigma_X^2 + \frac{1}{2} \left( f''(\mu_X) \right)^2 \sigma_X^4.
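As a quick sanity check of the univariate formulas, here is a minimal Monte Carlo sketch. The choice f(x) = e^x with a normally distributed X, and the parameter values, are illustrative assumptions, not part of the original text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice (assumption): f(x) = exp(x), X ~ Normal(mu, sigma).
# exp is its own derivative, so f = f' = f'' = np.exp.
mu, sigma = 0.5, 0.2
f = fp = fpp = np.exp

x = rng.normal(mu, sigma, size=1_000_000)

# First moment: E[f(X)] ~ f(mu) + f''(mu) * sigma^2 / 2
mean_taylor = f(mu) + 0.5 * fpp(mu) * sigma**2
print("E[f(X)]   Monte Carlo:", f(x).mean(), " Taylor:", mean_taylor)

# First-order variance (delta method): Var[f(X)] ~ f'(mu)^2 * sigma^2
var1 = fp(mu) ** 2 * sigma**2
# Second-order correction for normal X: + f''(mu)^2 * sigma^4 / 2
var2 = var1 + 0.5 * fpp(mu) ** 2 * sigma**4
print("Var[f(X)] Monte Carlo:", f(x).var(),
      " 1st order:", var1, " 2nd order:", var2)
```

Shrinking sigma in this sketch makes both approximations converge toward the Monte Carlo values, reflecting that the expansions are local around \mu_X.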

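The ratio formulas admit the same kind of check. The sketch below draws correlated jointly normal X and Y, with E[Y] kept well away from zero so that X/Y is well behaved; the distribution and all parameter values are again illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup (assumption): (X, Y) jointly normal,
# E[Y] far from 0 so the ratio X/Y is well behaved.
mean = np.array([2.0, 10.0])
cov = np.array([[1.0, 0.3],
                [0.3, 0.5]])
xy = rng.multivariate_normal(mean, cov, size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

ex, ey = mean
vx, vy, cxy = cov[0, 0], cov[1, 1], cov[0, 1]

# E[X/Y] ~ E[X]/E[Y] - Cov[X,Y]/E[Y]^2 + E[X] Var[Y] / E[Y]^3
mean_taylor = ex / ey - cxy / ey**2 + ex * vy / ey**3
print("E[X/Y]   Monte Carlo:", (x / y).mean(), " Taylor:", mean_taylor)

# Var[X/Y] ~ Var[X]/E[Y]^2 - 2 E[X] Cov[X,Y]/E[Y]^3 + E[X]^2 Var[Y]/E[Y]^4
var_taylor = vx / ey**2 - 2 * ex * cxy / ey**3 + ex**2 * vy / ey**4
print("Var[X/Y] Monte Carlo:", (x / y).var(), " Taylor:", var_taylor)
```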
[ "Taylor series" ]