# Standardized moment

In probability theory and statistics, the standardized moment of a probability distribution is a moment (normally a higher-degree central moment) that is normalized, typically by dividing by a power of the standard deviation, rendering the moment scale invariant. This has the advantage that such normalized moments differ only in properties other than location and spread, facilitating e.g. the comparison of the shapes of different probability distributions.[1]

## Standard normalization

Let X be a random variable with a probability distribution P and mean value ${\textstyle \mu =\mathrm {E} [X]}$ (i.e. the first raw moment, or moment about zero), where E denotes the expected value operator. Then the standardized moment of degree k is ${\displaystyle {\frac {\mu _{k}}{\sigma ^{k}}}\!}$,[2] that is, the ratio of the kth moment about the mean

${\displaystyle \mu _{k}=\operatorname {E} \left[(X-\mu )^{k}\right]=\int _{-\infty }^{+\infty }(x-\mu )^{k}P(x)\mathrm {d} x}$,

and the standard deviation to the power of k

${\displaystyle \sigma ^{k}={\Bigl (}{\sqrt {\mathrm {E} [(X-\mu )^{2}]}}{\Bigr )}^{k}}.$

The standard deviation is raised to the power k because moments scale as ${\displaystyle x^{k}}$, meaning that ${\displaystyle \mu _{k}(\lambda X)=\lambda ^{k}\mu _{k}(X)}$: they are homogeneous functions of degree k, so the standardized moment is scale invariant. This can also be understood in terms of dimensions: moments carry dimension, and in the ratio defining standardized moments the dimensions cancel, so standardized moments are dimensionless numbers.
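The homogeneity property can be checked numerically. The sketch below (the helper names `central_moment` and `standardized_moment` are illustrative, not standard library functions) draws a skewed sample, scales it by a factor λ, and verifies that the third central moment scales by λ³ while the third standardized moment is unchanged:

```python
import random

def central_moment(xs, k):
    """Sample analogue of the k-th moment about the mean, E[(X - mu)^k]."""
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** k for x in xs) / len(xs)

def standardized_moment(xs, k):
    """Sample analogue of mu_k / sigma^k."""
    return central_moment(xs, k) / central_moment(xs, 2) ** (k / 2)

random.seed(0)
# Exponential sample: skewed, so the third central moment is clearly nonzero.
xs = [random.expovariate(1.0) for _ in range(10_000)]
lam = 3.0
ys = [lam * x for x in xs]

# Central moments are homogeneous of degree k: mu_k(lam*X) = lam^k * mu_k(X).
print(central_moment(ys, 3) / central_moment(xs, 3))  # ~ lam**3 = 27
# Standardized moments are scale invariant: the lam^k factors cancel.
print(standardized_moment(ys, 3) - standardized_moment(xs, 3))  # ~ 0
```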

The first four standardized moments can be written as:

| Degree k | Standardized moment | Comment |
|---|---|---|
| 1 | ${\displaystyle {\hat {\mu }}_{1}={\frac {\mu _{1}}{\sigma ^{1}}}={\frac {\operatorname {E} \left[(X-\mu )^{1}\right]}{(\operatorname {E} \left[(X-\mu )^{2}\right])^{1/2}}}={\frac {\mu -\mu }{\sqrt {\operatorname {E} \left[(X-\mu )^{2}\right]}}}=0}$ | The first standardized moment is zero, because the first moment about the mean is always zero. |
| 2 | ${\displaystyle {\hat {\mu }}_{2}={\frac {\mu _{2}}{\sigma ^{2}}}={\frac {\operatorname {E} \left[(X-\mu )^{2}\right]}{(\operatorname {E} \left[(X-\mu )^{2}\right])^{2/2}}}=1}$ | The second standardized moment is one, because the second moment about the mean is equal to the variance σ². |
| 3 | ${\displaystyle {\hat {\mu }}_{3}={\frac {\mu _{3}}{\sigma ^{3}}}={\frac {\operatorname {E} \left[(X-\mu )^{3}\right]}{(\operatorname {E} \left[(X-\mu )^{2}\right])^{3/2}}}}$ | The third standardized moment is a measure of skewness. |
| 4 | ${\displaystyle {\hat {\mu }}_{4}={\frac {\mu _{4}}{\sigma ^{4}}}={\frac {\operatorname {E} \left[(X-\mu )^{4}\right]}{(\operatorname {E} \left[(X-\mu )^{2}\right])^{4/2}}}}$ | The fourth standardized moment measures kurtosis. |

Note that alternative definitions of skewness and kurtosis exist, based on the third and fourth cumulants respectively.

The kth standardized moment may be generalized as:

${\displaystyle {\hat {\mu }}_{k}={\frac {\mu _{k}}{\sigma ^{k}}}={\frac {\operatorname {E} \left[(X-\mu )^{k}\right]}{(\operatorname {E} \left[(X-\mu )^{2}\right])^{k/2}}}}$
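As a minimal sketch of this general formula (the helper `standardized_moment` is illustrative; it uses the population-style, biased sample moments), the following computes the first four standardized moments of a normal sample. The first two are fixed at 0 and 1 by construction, while the third and fourth are the sample skewness and kurtosis:

```python
import random

def standardized_moment(xs, k):
    """Sample analogue of mu_k / sigma^k = E[(X-mu)^k] / E[(X-mu)^2]^(k/2)."""
    mu = sum(xs) / len(xs)
    m = lambda j: sum((x - mu) ** j for x in xs) / len(xs)
    return m(k) / m(2) ** (k / 2)

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(50_000)]

print(standardized_moment(xs, 1))  # 0 by construction
print(standardized_moment(xs, 2))  # 1 by construction
print(standardized_moment(xs, 3))  # sample skewness, near 0 for a normal sample
print(standardized_moment(xs, 4))  # sample kurtosis, near 3 for a normal sample
```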

## Other normalizations

For more details on this topic, see Normalization (statistics).

Another scale-invariant, dimensionless measure of a distribution's characteristics is the coefficient of variation, ${\displaystyle {\frac {\sigma }{\mu }}}$. However, this is not a standardized moment: firstly, the ratio is inverted (the standard deviation divided by a moment, rather than a moment divided by a power of the standard deviation), and secondly, ${\displaystyle \mu }$ is the first moment about zero (the mean), not the first moment about the mean (which is zero).
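The distinction has a practical consequence that can be illustrated numerically: the coefficient of variation is scale invariant, like standardized moments, but unlike them it is not invariant under a shift of location, because shifting changes the mean while leaving the standard deviation unchanged. A small sketch (the helper names are illustrative):

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    """Population-style standard deviation, sqrt(E[(X - mu)^2])."""
    mu = mean(xs)
    return (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5

def cv(xs):
    """Coefficient of variation, sigma / mu."""
    return std(xs) / mean(xs)

random.seed(2)
xs = [random.expovariate(1.0) + 1.0 for _ in range(10_000)]

# Scale invariant: multiplying by 3 leaves the ratio unchanged.
print(cv([3.0 * x for x in xs]) - cv(xs))   # ~ 0
# Not shift invariant: adding a constant changes mu but not sigma.
print(cv([x + 10.0 for x in xs]), cv(xs))   # clearly different values
```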

See Normalization (statistics) for further normalizing ratios.