Taylor expansions for the moments of functions of random variables

From Wikipedia, the free encyclopedia

In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.

First moment

Given $\mu_X = \operatorname{E}[X]$ and $\sigma_X^2 = \operatorname{Var}[X]$, a second-order Taylor expansion of $f$ about $\mu_X$ gives

$$f(X) \approx f(\mu_X) + f'(\mu_X)(X - \mu_X) + \tfrac{1}{2} f''(\mu_X)(X - \mu_X)^2.$$

Taking expectations,

$$\operatorname{E}[f(X)] \approx f(\mu_X) + f'(\mu_X)\operatorname{E}[X - \mu_X] + \tfrac{1}{2} f''(\mu_X)\operatorname{E}[(X - \mu_X)^2].$$

Since $\operatorname{E}[X - \mu_X] = 0$, the second term disappears. Also, $\operatorname{E}[(X - \mu_X)^2]$ is $\sigma_X^2$. Therefore,

$$\operatorname{E}[f(X)] \approx f(\mu_X) + \frac{\sigma_X^2}{2} f''(\mu_X),$$

where $\mu_X$ and $\sigma_X^2$ are the mean and variance of X respectively.[1]
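The approximation of the first moment can be sanity-checked numerically. The sketch below uses an illustrative setup (not from the article): $f(x) = e^x$ with $X$ normal, for which the exact value $\operatorname{E}[e^X] = e^{\mu + \sigma^2/2}$ is available for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice: f(x) = exp(x), X ~ Normal(mu, sigma)
mu, sigma = 2.0, 0.3
f = np.exp        # f(x) = e^x
f2 = np.exp       # f''(x) = e^x as well

# Second-order approximation: E[f(X)] ≈ f(mu) + (sigma^2 / 2) * f''(mu)
approx = f(mu) + 0.5 * sigma**2 * f2(mu)

# Monte Carlo estimate of E[f(X)] for comparison
samples = rng.normal(mu, sigma, size=1_000_000)
mc = f(samples).mean()

# Exact value for this choice of f: E[e^X] = exp(mu + sigma^2 / 2)
exact = np.exp(mu + 0.5 * sigma**2)
```

The correction term $(\sigma^2/2) f''(\mu)$ removes most of the bias of the naive estimate $f(\operatorname{E}[X])$, which ignores the curvature of $f$.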

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

$$\operatorname{E}\!\left[\frac{X}{Y}\right] \approx \frac{\operatorname{E}[X]}{\operatorname{E}[Y]} - \frac{\operatorname{Cov}[X, Y]}{\operatorname{E}[Y]^2} + \frac{\operatorname{E}[X]}{\operatorname{E}[Y]^3}\operatorname{Var}[Y].$$
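One standard multivariate case approximates $\operatorname{E}[X/Y]$ by $\operatorname{E}[X]/\operatorname{E}[Y] - \operatorname{Cov}[X,Y]/\operatorname{E}[Y]^2 + \operatorname{E}[X]\operatorname{Var}[Y]/\operatorname{E}[Y]^3$. A minimal numerical check, with illustrative parameters chosen so that $Y$ stays well away from zero:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative jointly normal (X, Y); E[Y] is kept far from zero so X/Y is stable
mean = np.array([1.0, 4.0])                  # E[X], E[Y]
cov = np.array([[0.5, 0.2],
                [0.2, 0.3]])                 # covariance matrix of (X, Y)
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

ex, ey = mean
cov_xy, var_y = cov[0, 1], cov[1, 1]

# Second-order approximation of E[X/Y]
approx = ex / ey - cov_xy / ey**2 + ex * var_y / ey**3

# Monte Carlo estimate for comparison
mc = (x / y).mean()
```

Note the approximation degrades quickly as $\operatorname{E}[Y]$ approaches zero relative to its standard deviation, since the expansion is around a point where $1/y$ is nearly singular.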

Second moment

Similarly,[1]

$$\operatorname{Var}[f(X)] \approx \left(f'(\operatorname{E}[X])\right)^2 \operatorname{Var}[X] = \left(f'(\mu_X)\right)^2 \sigma_X^2.$$

The above uses only a first-order Taylor approximation, unlike the second-order approximation used in estimating the first moment, and it will be a poor approximation in cases where $f(X)$ is highly non-linear. This is a special case of the delta method. For example,

$$\operatorname{Var}\!\left[\frac{X}{Y}\right] \approx \frac{\operatorname{Var}[X]}{\operatorname{E}[Y]^2} - \frac{2\operatorname{E}[X]}{\operatorname{E}[Y]^3}\operatorname{Cov}[X, Y] + \frac{\operatorname{E}[X]^2}{\operatorname{E}[Y]^4}\operatorname{Var}[Y].$$
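The first-order (delta method) variance approximation $\operatorname{Var}[f(X)] \approx (f'(\mu_X))^2 \sigma_X^2$ can be checked with an illustrative univariate example, $f(x) = \ln x$ with $\mu \gg \sigma$ so that the function is well defined on essentially all samples:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative setup: f(x) = ln(x), X ~ Normal(mu, sigma) with mu >> sigma
mu, sigma = 10.0, 1.0

# First-order (delta method) approximation: Var[ln X] ≈ (f'(mu))^2 * sigma^2,
# with f'(x) = 1/x
approx = (1.0 / mu) ** 2 * sigma**2

# Monte Carlo estimate of the true variance
samples = rng.normal(mu, sigma, size=1_000_000)
mc = np.log(samples).var()
```

Here $f$ is nearly linear over the bulk of the distribution, so the first-order approximation is accurate; shrinking $\mu$ toward $\sigma$ makes $f$ strongly non-linear on the sampled range and the approximation visibly worse.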

The second-order approximation, when X follows a normal distribution, is[2]

$$\operatorname{Var}[f(X)] \approx \left(f'(\mu_X)\right)^2 \sigma_X^2 + \frac{1}{2}\left(f''(\mu_X)\right)^2 \sigma_X^4.$$
Notes

  1. Haym Benaroya, Seon Mi Han, and Mark Nagurka. Probability Models in Engineering and Science. CRC Press, 2005.
  2. Hendeby, Gustaf; Gustafsson, Fredrik. "On Nonlinear Transformations of Gaussian Distributions" (PDF). Retrieved 5 October 2017.