Taylor expansions for the moments of functions of random variables

In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.

First moment

Given $\mu_X$ and $\sigma_X^2$, the mean and variance of $X$ respectively,[1] a Taylor expansion of the expected value of $f(X)$ about $\mu_X$ gives

$$E[f(X)] = E\left[f\big(\mu_X + (X - \mu_X)\big)\right] \approx E\left[f(\mu_X) + f'(\mu_X)(X - \mu_X) + \tfrac{1}{2} f''(\mu_X)(X - \mu_X)^2\right].$$

Since $E[X - \mu_X] = 0$, the second term disappears; also, $E[(X - \mu_X)^2]$ is $\sigma_X^2$. Therefore,

$$E[f(X)] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\,\sigma_X^2.$$
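
As a quick numerical check, here is a minimal Python sketch of this approximation. The choice $f(x) = e^x$ with $X$ normal is an assumption made purely for illustration, because the exact mean $e^{\mu + \sigma^2/2}$ is then known in closed form:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative assumption: f(x) = exp(x) and X ~ Normal(mu, sigma^2),
    # chosen because E[exp(X)] = exp(mu + sigma^2/2) is known exactly.
    mu, sigma = 0.1, 0.2
    x = rng.normal(mu, sigma, size=1_000_000)

    monte_carlo = np.exp(x).mean()                       # brute-force estimate of E[f(X)]
    taylor = np.exp(mu) + 0.5 * np.exp(mu) * sigma**2    # f(mu) + f''(mu)/2 * sigma^2
    exact = np.exp(mu + sigma**2 / 2)

    print(f"Monte Carlo: {monte_carlo:.5f}")   # ~1.1275
    print(f"Taylor:      {taylor:.5f}")        # ~1.1273
    print(f"Exact:       {exact:.5f}")         # ~1.1275

For this mildly non-linear $f$ and small $\sigma$, the second-order correction already recovers the exact mean to about four decimal places.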

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

$$E\left[\frac{X}{Y}\right] \approx \frac{E[X]}{E[Y]} - \frac{\operatorname{Cov}[X,Y]}{E[Y]^2} + \frac{E[X]}{E[Y]^3}\operatorname{Var}[Y].$$
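
The ratio approximation can be checked the same way. The jointly normal pair below, with its particular mean vector and covariance matrix, is an assumed example; $E[Y]$ is kept far from zero so the ratio is well behaved:

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative assumption: (X, Y) jointly normal, E[Y] far from zero.
    mean = np.array([2.0, 10.0])
    cov = np.array([[1.0, 0.5],
                    [0.5, 2.0]])
    x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

    mu_x, mu_y = mean
    cov_xy, var_y = cov[0, 1], cov[1, 1]

    monte_carlo = np.mean(x / y)
    # E[X]/E[Y] - Cov[X,Y]/E[Y]^2 + E[X] Var[Y]/E[Y]^3
    taylor = mu_x / mu_y - cov_xy / mu_y**2 + mu_x * var_y / mu_y**3

    print(f"Monte Carlo: {monte_carlo:.5f}")   # ~0.1990
    print(f"Taylor:      {taylor:.5f}")        # 0.19900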

Second moment

Analogously,[1]

$$\operatorname{Var}[f(X)] \approx \left(f'(E[X])\right)^2 \operatorname{Var}[X] = \left(f'(\mu_X)\right)^2 \sigma_X^2.$$

Unlike the second-order expansion used to estimate the first moment, this is only a first-order approximation, so it will be poor where $f$ is highly non-linear. It is a special case of the delta method. For example,

$$\operatorname{Var}\left[\frac{X}{Y}\right] \approx \frac{E[X]^2}{E[Y]^2}\left(\frac{\operatorname{Var}[X]}{E[X]^2} - \frac{2\operatorname{Cov}[X,Y]}{E[X]\,E[Y]} + \frac{\operatorname{Var}[Y]}{E[Y]^2}\right).$$
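
A sketch of the univariate case makes the first-order error visible. As before, $f(x) = e^x$ with $X$ normal is an assumed example, since $\operatorname{Var}[e^X] = e^{2\mu + \sigma^2}(e^{\sigma^2} - 1)$ is known exactly:

    import numpy as np

    rng = np.random.default_rng(2)

    # Illustrative assumption: f(x) = exp(x) and X ~ Normal(mu, sigma^2),
    # where Var[exp(X)] = exp(2*mu + sigma^2) * (exp(sigma^2) - 1) exactly.
    mu, sigma = 0.1, 0.2
    x = rng.normal(mu, sigma, size=1_000_000)

    monte_carlo = np.exp(x).var()
    delta = np.exp(mu) ** 2 * sigma**2    # (f'(mu))^2 * Var[X]
    exact = np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1)

    print(f"Monte Carlo:  {monte_carlo:.5f}")   # ~0.0519
    print(f"Delta method: {delta:.5f}")         # ~0.0489
    print(f"Exact:        {exact:.5f}")         # ~0.0519

The gap between the delta-method value and the exact variance, noticeably larger than in the first-moment check, reflects the first-order nature of this approximation for a convex $f$.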

Notes

  1. Haym Benaroya, Seon Mi Han, and Mark Nagurka. Probability Models in Engineering and Science. CRC Press, 2005.