Standardized moment

In probability theory and statistics, a standardized moment of a probability distribution is a moment (normally a higher-degree central moment) that is normalized. The normalization is typically a division by a power of the standard deviation, which renders the moment scale invariant. This has the advantage that such normalized moments differ only in properties other than variability, facilitating, for example, comparison of the shapes of different probability distributions.

Let X be a random variable with a probability distribution P and mean value $\mu = \mathrm{E}[X]$ (i.e. the first raw moment, or moment about zero), the operator E denoting the expected value of X. Then the standardized moment of degree k is

$$\frac{\mu_k}{\sigma^k},$$

that is, the ratio of the kth moment about the mean, $\mu_k = \mathrm{E}[(X-\mu)^k]$, to the kth power of the standard deviation $\sigma$.

The power of k is used because moments scale as $x^k$, meaning that $\mu_k(\lambda X) = \lambda^k \mu_k(X)$: they are homogeneous functions of degree k, and thus the standardized moment is scale invariant. This can also be understood in terms of dimensions: in the ratio defining standardized moments above, the dimensions cancel, so they are dimensionless numbers.
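As a concrete illustration of the definition and of scale invariance, here is a minimal Python sketch. The helper name standardized_moment, the exponential sample, and the chosen scale factor are assumptions made for the example, not part of the original article.

```python
# A minimal sketch, assuming NumPy is available; names and data are illustrative.
import numpy as np

def standardized_moment(x, k):
    """Estimate the k-th standardized moment of a sample: mu_k / sigma^k."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()                        # first raw moment (the mean)
    central_k = ((x - mu) ** k).mean()   # k-th central moment, mu_k
    sigma = x.std()                      # standard deviation
    return central_k / sigma ** k

rng = np.random.default_rng(0)
sample = rng.exponential(scale=2.0, size=100_000)

# The third standardized moment (skewness) is unchanged by rescaling the data,
# since mu_k(lambda X) = lambda^k mu_k(X) cancels against sigma^k in the ratio.
print(standardized_moment(sample, 3))         # roughly 2 for an exponential distribution
print(standardized_moment(5.0 * sample, 3))   # essentially the same value after scaling
```

The two printed values agree (up to floating-point noise), which is exactly the scale invariance described above: multiplying X by a constant multiplies both the numerator and the denominator of the ratio by the same factor.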

[ "Geometric standard deviation", "Central moment" ]
Parent Topic
Child Topic
    No Parent Topic