# Taylor expansions for the moments of functions of random variables

In probability theory, it is possible to approximate the moments of a function f of a random variable X using Taylor expansions, provided that f is sufficiently differentiable and that the moments of X are finite.

## First moment

Given ${\displaystyle \mu _{X}}$ and ${\displaystyle \sigma _{X}^{2}}$, the mean and the variance of ${\displaystyle X}$, respectively,[1] a Taylor expansion of the expected value of ${\displaystyle f(X)}$ can be found via

{\displaystyle {\begin{aligned}\operatorname {E} \left[f(X)\right]&{}=\operatorname {E} \left[f\left(\mu _{X}+\left(X-\mu _{X}\right)\right)\right]\\&{}\approx \operatorname {E} \left[f(\mu _{X})+f'(\mu _{X})\left(X-\mu _{X}\right)+{\frac {1}{2}}f''(\mu _{X})\left(X-\mu _{X}\right)^{2}\right]\\&{}=f(\mu _{X})+f'(\mu _{X})\operatorname {E} \left[X-\mu _{X}\right]+{\frac {1}{2}}f''(\mu _{X})\operatorname {E} \left[\left(X-\mu _{X}\right)^{2}\right].\end{aligned}}}

Since ${\displaystyle E[X-\mu _{X}]=0,}$ the second term vanishes. Also, ${\displaystyle E[(X-\mu _{X})^{2}]}$ is ${\displaystyle \sigma _{X}^{2}}$. Therefore,

${\displaystyle \operatorname {E} \left[f(X)\right]\approx f(\mu _{X})+{\frac {f''(\mu _{X})}{2}}\sigma _{X}^{2}}$.
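As a sanity check, the approximation can be compared against a case with a known closed form. The sketch below (an illustration, not part of the original text, assuming NumPy) uses ${\displaystyle f(x)=e^{x}}$ with normal ${\displaystyle X}$, for which ${\displaystyle \operatorname {E} \left[e^{X}\right]=e^{\mu _{X}+\sigma _{X}^{2}/2}}$ exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 0.3                       # mean and standard deviation of X
x = rng.normal(mu, sigma, 1_000_000)

f = np.exp                                  # f(x) = e^x, so f''(x) = e^x as well
approx = f(mu) + 0.5 * f(mu) * sigma**2     # f(mu) + f''(mu)/2 * sigma^2
exact = np.exp(mu + sigma**2 / 2)           # closed form for E[e^X], X normal
monte_carlo = f(x).mean()

print(approx, exact, monte_carlo)
```

Here the neglected terms are of order ${\displaystyle \sigma _{X}^{4}}$, so for small ${\displaystyle \sigma _{X}}$ the approximation tracks the exact value closely.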

It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,

${\displaystyle \operatorname {E} \left[{\frac {X}{Y}}\right]\approx {\frac {\operatorname {E} \left[X\right]}{\operatorname {E} \left[Y\right]}}-{\frac {\operatorname {cov} \left[X,Y\right]}{\operatorname {E} \left[Y\right]^{2}}}+{\frac {\operatorname {E} \left[X\right]}{\operatorname {E} \left[Y\right]^{3}}}\operatorname {var} \left[Y\right]}$
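A quick Monte Carlo check of this ratio approximation (illustrative, assuming NumPy; the means and covariance below are arbitrary, with ${\displaystyle \operatorname {E} \left[Y\right]}$ kept well away from zero so the ratio is well behaved):

```python
import numpy as np

rng = np.random.default_rng(1)
mean = np.array([2.0, 10.0])
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
x, y = rng.multivariate_normal(mean, cov, 1_000_000).T

ex, ey = mean
# E[X/Y] ~ E[X]/E[Y] - cov[X,Y]/E[Y]^2 + E[X]/E[Y]^3 * var[Y]
approx = ex / ey - cov[0, 1] / ey**2 + ex / ey**3 * cov[1, 1]
monte_carlo = (x / y).mean()
print(approx, monte_carlo)
```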

## Second moment

Similarly,[1]

${\displaystyle \operatorname {var} \left[f(X)\right]\approx \left(f'(\operatorname {E} \left[X\right])\right)^{2}\operatorname {var} \left[X\right]=\left(f'(\mu _{X})\right)^{2}\sigma _{X}^{2}.}$

Retaining the second-order term of the expansion as well, and neglecting the central moments of ${\displaystyle X}$ beyond the second, gives

${\displaystyle \operatorname {var} \left[f(X)\right]\approx \left(f'(\mu _{X})\right)^{2}\sigma _{X}^{2}-{\frac {1}{4}}\left(f''(\mu _{X})\right)^{2}\sigma _{X}^{4}.}$

The above is obtained using a second-order approximation, following the method used in estimating the first moment. It will be a poor approximation in cases where ${\displaystyle f}$ is highly non-linear. This is a special case of the delta method.

Indeed, taking ${\displaystyle \operatorname {E} \left[f(X)\right]\approx f(\mu _{X})+{\frac {f''(\mu _{X})}{2}}\sigma _{X}^{2}}$ with ${\displaystyle f(X)=g(X)^{2}}$ and writing ${\displaystyle Y=g(X)}$ yields an approximation for ${\displaystyle \operatorname {E} \left[Y^{2}\right]}$. The variance is then computed using the formula ${\displaystyle \operatorname {var} \left[Y\right]=\operatorname {E} \left[Y^{2}\right]-\mu _{Y}^{2}}$.
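The first-order (delta-method) part of this approximation can be checked numerically. The sketch below (illustrative, assuming NumPy) uses ${\displaystyle f(x)=\log x}$, for which ${\displaystyle f'(\mu _{X})=1/\mu _{X}}$ and hence ${\displaystyle \operatorname {var} \left[\log X\right]\approx \sigma _{X}^{2}/\mu _{X}^{2}}$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 10.0, 0.5                 # X concentrated far from 0, so log is smooth
x = rng.normal(mu, sigma, 1_000_000)

# Delta method with f(x) = log(x): f'(mu) = 1/mu, so var[log X] ~ sigma^2 / mu^2
approx = sigma**2 / mu**2             # 0.0025
monte_carlo = np.log(x).var()
print(approx, monte_carlo)
```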

For example,

${\displaystyle \operatorname {var} \left[{\frac {X}{Y}}\right]\approx {\frac {\operatorname {var} \left[X\right]}{\operatorname {E} \left[Y\right]^{2}}}-{\frac {2\operatorname {E} \left[X\right]}{\operatorname {E} \left[Y\right]^{3}}}\operatorname {cov} \left[X,Y\right]+{\frac {\operatorname {E} \left[X\right]^{2}}{\operatorname {E} \left[Y\right]^{4}}}\operatorname {var} \left[Y\right].}$
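This ratio-variance formula can also be checked by simulation (illustrative sketch, assuming NumPy; parameters are arbitrary, with ${\displaystyle \operatorname {E} \left[Y\right]}$ far from zero):

```python
import numpy as np

rng = np.random.default_rng(3)
mean = np.array([2.0, 10.0])
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
x, y = rng.multivariate_normal(mean, cov, 1_000_000).T

ex, ey = mean
vx, vy, cxy = cov[0, 0], cov[1, 1], cov[0, 1]
# var[X/Y] ~ var[X]/E[Y]^2 - 2 E[X]/E[Y]^3 cov[X,Y] + E[X]^2/E[Y]^4 var[Y]
approx = vx / ey**2 - 2 * ex / ey**3 * cxy + ex**2 / ey**4 * vy
monte_carlo = (x / y).var()
print(approx, monte_carlo)
```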

When ${\displaystyle X}$ follows a normal distribution, the second-order approximation is:[2]

${\displaystyle \operatorname {var} \left[f(X)\right]\approx \left(f'(\mu _{X})\right)^{2}\sigma _{X}^{2}+{\frac {1}{2}}\left(f''(\mu _{X})\right)^{2}\sigma _{X}^{4}+f'(\mu _{X})f'''(\mu _{X})\sigma _{X}^{4}}$
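This refinement can be checked against ${\displaystyle f(x)=e^{x}}$ with normal ${\displaystyle X}$, since ${\displaystyle e^{X}}$ is then lognormal with the known variance ${\displaystyle e^{2\mu _{X}+\sigma _{X}^{2}}(e^{\sigma _{X}^{2}}-1)}$ (illustrative sketch, assuming NumPy):

```python
import numpy as np

mu, sigma = 0.0, 0.3
s2, s4 = sigma**2, sigma**4

# f(x) = e^x, so f' = f'' = f''' = e^x; X ~ N(mu, sigma^2)
fp = fpp = fppp = np.exp(mu)
first_order = fp**2 * s2
second_order = fp**2 * s2 + 0.5 * fpp**2 * s4 + fp * fppp * s4
exact = np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1)  # lognormal variance
print(first_order, second_order, exact)
```

The second-order approximation recovers the exact variance up to ${\displaystyle O(\sigma _{X}^{6})}$, whereas the first-order one is off already at ${\displaystyle O(\sigma _{X}^{4})}$.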

## First product moment

To find a second-order approximation for the covariance of functions of two random variables (with the same function applied to both), one can proceed as follows. First, note that ${\displaystyle \operatorname {cov} \left[f(X),f(Y)\right]=\operatorname {E} \left[f(X)f(Y)\right]-\operatorname {E} \left[f(X)\right]\operatorname {E} \left[f(Y)\right]}$. Since a second-order expansion for ${\displaystyle \operatorname {E} \left[f(X)\right]}$ has already been derived above, it only remains to find ${\displaystyle \operatorname {E} \left[f(X)f(Y)\right]}$. Treating ${\displaystyle f(X)f(Y)}$ as a two-variable function, the second-order Taylor expansion is as follows:

{\displaystyle {\begin{aligned}f(X)f(Y)&{}\approx f(\mu _{X})f(\mu _{Y})+(X-\mu _{X})f'(\mu _{X})f(\mu _{Y})+(Y-\mu _{Y})f(\mu _{X})f'(\mu _{Y})+{\frac {1}{2}}\left[(X-\mu _{X})^{2}f''(\mu _{X})f(\mu _{Y})+2(X-\mu _{X})(Y-\mu _{Y})f'(\mu _{X})f'(\mu _{Y})+(Y-\mu _{Y})^{2}f(\mu _{X})f''(\mu _{Y})\right]\end{aligned}}}

Taking expectation of the above and simplifying—making use of the identities ${\displaystyle \operatorname {E} (X^{2})=\operatorname {var} (X)+\left[\operatorname {E} (X)\right]^{2}}$ and ${\displaystyle \operatorname {E} (XY)=\operatorname {cov} (X,Y)+\left[\operatorname {E} (X)\right]\left[\operatorname {E} (Y)\right]}$—leads to ${\displaystyle \operatorname {E} \left[f(X)f(Y)\right]\approx f(\mu _{X})f(\mu _{Y})+f'(\mu _{X})f'(\mu _{Y})\operatorname {cov} (X,Y)+{\frac {1}{2}}f''(\mu _{X})f(\mu _{Y})\operatorname {var} (X)+{\frac {1}{2}}f(\mu _{X})f''(\mu _{Y})\operatorname {var} (Y)}$. Hence,

{\displaystyle {\begin{aligned}\operatorname {cov} \left[f(X),f(Y)\right]&{}\approx f(\mu _{X})f(\mu _{Y})+f'(\mu _{X})f'(\mu _{Y})\operatorname {cov} (X,Y)+{\frac {1}{2}}f''(\mu _{X})f(\mu _{Y})\operatorname {var} (X)+{\frac {1}{2}}f(\mu _{X})f''(\mu _{Y})\operatorname {var} (Y)-\left[f(\mu _{X})+{\frac {1}{2}}f''(\mu _{X})\operatorname {var} (X)\right]\left[f(\mu _{Y})+{\frac {1}{2}}f''(\mu _{Y})\operatorname {var} (Y)\right]\\&{}=f'(\mu _{X})f'(\mu _{Y})\operatorname {cov} (X,Y)-{\frac {1}{4}}f''(\mu _{X})f''(\mu _{Y})\operatorname {var} (X)\operatorname {var} (Y)\end{aligned}}}
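The covariance approximation can likewise be compared with a closed form. For jointly normal ${\displaystyle X,Y}$ and ${\displaystyle f(x)=e^{x}}$, the exact value is ${\displaystyle \operatorname {cov} \left[e^{X},e^{Y}\right]=e^{\mu _{X}+\mu _{Y}+(\sigma _{X}^{2}+\sigma _{Y}^{2})/2}\left(e^{\operatorname {cov} (X,Y)}-1\right)}$ (illustrative sketch, assuming NumPy):

```python
import numpy as np

mu_x = mu_y = 0.0
var_x = var_y = 0.04
cov_xy = 0.02

# f(x) = e^x, so f' = f'' = e^x
fp = fpp = np.exp
approx = (fp(mu_x) * fp(mu_y) * cov_xy
          - 0.25 * fpp(mu_x) * fpp(mu_y) * var_x * var_y)
# Exact covariance of e^X and e^Y for jointly normal (X, Y):
exact = np.exp(mu_x + mu_y + (var_x + var_y) / 2) * (np.exp(cov_xy) - 1)
print(approx, exact)
```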

## Random vectors

If X is a random vector, the approximations for the mean and variance of ${\displaystyle f(X)}$ are given by[3]

{\displaystyle {\begin{aligned}\operatorname {E} (f(X))&\approx f(\mu _{X})+{\frac {1}{2}}\operatorname {trace} (H_{f}(\mu _{X})\Sigma _{X})\\\operatorname {var} (f(X))&\approx \nabla f(\mu _{X})^{t}\Sigma _{X}\nabla f(\mu _{X})+{\frac {1}{2}}\operatorname {trace} \left(H_{f}(\mu _{X})\Sigma _{X}H_{f}(\mu _{X})\Sigma _{X}\right).\end{aligned}}}

Here ${\displaystyle \nabla f}$ and ${\displaystyle H_{f}}$ denote the gradient and the Hessian matrix respectively, and ${\displaystyle \Sigma _{X}}$ is the covariance matrix of X.
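As an illustration (not from the source), consider the quadratic function ${\displaystyle f(x)=x_{1}x_{2}}$, whose second-order Taylor expansion is exact; for jointly normal components, both approximations then reproduce the known closed forms for the mean and variance of a product of two Gaussian variables (a sketch assuming NumPy):

```python
import numpy as np

# f(x) = x[0] * x[1]: gradient [x1, x0], Hessian [[0, 1], [1, 0]]
mu = np.array([2.0, 3.0])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])

grad = np.array([mu[1], mu[0]])
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

mean_approx = mu[0] * mu[1] + 0.5 * np.trace(H @ Sigma)
var_approx = grad @ Sigma @ grad + 0.5 * np.trace(H @ Sigma @ H @ Sigma)

# Known closed forms for a product of jointly normal variables:
mean_exact = mu[0] * mu[1] + Sigma[0, 1]
var_exact = (mu[1]**2 * Sigma[0, 0] + mu[0]**2 * Sigma[1, 1]
             + 2 * mu[0] * mu[1] * Sigma[0, 1]
             + Sigma[0, 0] * Sigma[1, 1] + Sigma[0, 1]**2)
print(mean_approx, var_approx)
```

For general (non-quadratic) ${\displaystyle f}$ the formulas above remain approximations, with errors driven by higher derivatives and higher moments.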