Taylor expansions for the moments of functions of random variables

From Wikipedia, the free encyclopedia

In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.

First moment

A second-order Taylor expansion of $f(X)$ about the mean of $X$ gives

$$\operatorname{E}[f(X)] = \operatorname{E}\left[f\left(\mu_X + (X - \mu_X)\right)\right] \approx \operatorname{E}\left[f(\mu_X) + f'(\mu_X)(X - \mu_X) + \tfrac{1}{2} f''(\mu_X)(X - \mu_X)^2\right].$$

Since $\operatorname{E}[X - \mu_X] = 0$, the second term disappears. Also, $\operatorname{E}\left[(X - \mu_X)^2\right]$ is $\sigma_X^2$. Therefore,

$$\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\,\sigma_X^2,$$

where $\mu_X$ and $\sigma_X^2$ are the mean and variance of X respectively.[1]
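As a numerical sanity check (not part of the original article), the first-moment approximation can be compared with a Monte Carlo estimate. Here $f(x) = e^x$ and the normal parameters are arbitrary illustrative choices, picked because $f'' = f$ keeps the Taylor term easy to write down:

```python
import numpy as np

# Illustrative check of E[f(X)] ≈ f(mu) + f''(mu)/2 * sigma^2.
# f(x) = exp(x) is a convenient test function with f'' = f.
rng = np.random.default_rng(0)
mu, sigma = 0.5, 0.1                        # assumed mean and std. dev. of X
x = rng.normal(mu, sigma, size=1_000_000)   # X ~ N(mu, sigma^2)

mc_mean = np.exp(x).mean()                          # Monte Carlo estimate of E[f(X)]
taylor = np.exp(mu) + 0.5 * np.exp(mu) * sigma**2   # second-order Taylor value

print(mc_mean, taylor)
```

For normal $X$ the exact value is $e^{\mu + \sigma^2/2}$, so the close agreement here also reflects that $\sigma$ is small relative to the curvature of $f$.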

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

$$\operatorname{E}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{E}[X]}{\operatorname{E}[Y]} - \frac{\operatorname{cov}[X, Y]}{\operatorname{E}[Y]^2} + \frac{\operatorname{E}[X]}{\operatorname{E}[Y]^3}\operatorname{var}[Y].$$
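As an illustrative check of the multivariate case (the joint-normal setup and all parameter values below are arbitrary choices, not from the article), the simulated mean of $X/Y$ can be compared with the second-order approximation built from $\operatorname{E}[X]$, $\operatorname{E}[Y]$, $\operatorname{cov}[X, Y]$, and $\operatorname{var}[Y]$:

```python
import numpy as np

# Sketch: second-order approximation of E[X/Y] vs. simulation.
rng = np.random.default_rng(1)
mean = np.array([10.0, 5.0])            # E[X], E[Y] (illustrative values)
cov = np.array([[1.0, 0.3],
                [0.3, 0.5]])            # covariance matrix of (X, Y)
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

mc_ratio = np.mean(x / y)
# E[X]/E[Y] - cov[X,Y]/E[Y]^2 + E[X]*var[Y]/E[Y]^3
approx = (mean[0] / mean[1]
          - cov[0, 1] / mean[1]**2
          + mean[0] * cov[1, 1] / mean[1]**3)

print(mc_ratio, approx)
```

Keeping $\operatorname{E}[Y]$ well away from zero matters here: the expansion (and indeed $\operatorname{E}[X/Y]$ itself) behaves badly when $Y$ has appreciable mass near zero.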

Second moment

Similarly,[1]

$$\operatorname{var}[f(X)] \approx \left(f'(\mu_X)\right)^2 \sigma_X^2 - \frac{1}{4}\left(f''(\mu_X)\right)^2 \sigma_X^4.$$

The above is obtained using a second-order approximation, following the method used in estimating the first moment. It will be a poor approximation in cases where $f(X)$ is highly non-linear. This is a special case of the delta method.

Indeed, we take

$$\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\,\sigma_X^2.$$

With $f(X) \approx f(\mu_X) + f'(\mu_X)(X - \mu_X) + \frac{f''(\mu_X)}{2}(X - \mu_X)^2$, we get

$$\operatorname{E}\left[f(X)^2\right] \approx f(\mu_X)^2 + f'(\mu_X)^2\,\sigma_X^2 + f(\mu_X)\,f''(\mu_X)\,\sigma_X^2.$$

The variance is then computed using the formula $\operatorname{var}[f(X)] = \operatorname{E}\left[f(X)^2\right] - \left(\operatorname{E}[f(X)]\right)^2$. In this final step, we assume that the terms involving the third and fourth central moments of X, namely $f'(\mu_X) f''(\mu_X) \operatorname{E}\left[(X - \mu_X)^3\right]$ and $\frac{1}{4} f''(\mu_X)^2 \operatorname{E}\left[(X - \mu_X)^4\right]$, can be ignored.
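The algebra in this derivation can also be verified symbolically. The sketch below (not from the article) expands $f$ to second order with symbolic coefficients, applies $\operatorname{E}[d] = 0$ and $\operatorname{E}[d^2] = \sigma^2$ for $d = X - \mu_X$, and drops the third- and fourth-moment terms exactly as above:

```python
import sympy as sp

d, s2 = sp.symbols('d sigma2')          # d = X - mu, sigma2 = var[X]
f0, f1, f2 = sp.symbols('f0 f1 f2')     # f(mu), f'(mu), f''(mu)

f = f0 + f1 * d + sp.Rational(1, 2) * f2 * d**2   # second-order expansion

def expect(expr):
    """Expectation of a polynomial in d, dropping the E[d^3], E[d^4] terms."""
    moments = {0: 1, 1: 0, 2: s2, 3: 0, 4: 0}
    expr = sp.expand(expr)
    return sum(expr.coeff(d, k) * m for k, m in moments.items())

var = sp.expand(expect(f**2) - expect(f)**2)
print(var)  # reduces to f1**2 * sigma2 - f2**2 * sigma2**2 / 4
```

Replacing the zero entries for the third and fourth moments with symbols reproduces the discarded terms $f' f'' \mu_3 + \tfrac{1}{4} f''^2 (\mu_4 - \sigma^4)$, which is what the "can be ignored" step drops.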

An example is,

$$\operatorname{var}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{var}[X]}{\operatorname{E}[Y]^2} - \frac{2\operatorname{E}[X]}{\operatorname{E}[Y]^3}\operatorname{cov}[X, Y] + \frac{\operatorname{E}[X]^2}{\operatorname{E}[Y]^4}\operatorname{var}[Y].$$

The second order approximation, when X follows a normal distribution, is:[2]

$$\operatorname{var}[f(X)] \approx \left(f'(\operatorname{E}[X])\right)^2 \operatorname{var}[X] + \frac{\left(f''(\operatorname{E}[X])\right)^2}{2}\left(\operatorname{var}[X]\right)^2.$$
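As a final numerical sketch (illustrative parameters, not from the article), the first- and second-order variance approximations can be compared against simulation for normal $X$ with $f(x) = e^x$, so that $f' = f'' = f$:

```python
import numpy as np

# Sketch: variance approximations for f(X) = exp(X), X ~ N(mu, sigma^2).
rng = np.random.default_rng(2)
mu, sigma = 0.0, 0.2                        # illustrative choice: N(0, 0.04)
x = rng.normal(mu, sigma, size=1_000_000)

mc_var = np.var(np.exp(x))                  # simulated var[f(X)]
first_order = np.exp(mu)**2 * sigma**2      # f'(mu)^2 * sigma^2
second_order = first_order + 0.5 * np.exp(mu)**2 * sigma**4  # + f''(mu)^2 sigma^4 / 2

print(mc_var, first_order, second_order)
```

For this lognormal case the exact value is $e^{2\mu + \sigma^2}(e^{\sigma^2} - 1) \approx 0.0425$, so the extra Gaussian second-order term moves the estimate in the right direction relative to the first-order value.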

See also


Notes

  1. Haym Benaroya, Seon Mi Han, and Mark Nagurka. Probability Models in Engineering and Science. CRC Press, 2005, p. 166.
  2. Hendeby, Gustaf; Gustafsson, Fredrik. "On Nonlinear Transformations of Gaussian Distributions" (PDF). Retrieved 5 October 2017.

Further reading