# Method of moments (statistics)

In statistics and econometrics, the method of moments is a method of estimation of population parameters. One starts by deriving equations that relate the population moments (i.e., the expected values of powers of the random variable under consideration) to the parameters of interest. Then a sample is drawn and the population moments are estimated from the sample. The equations are then solved for the parameters of interest, using the sample moments in place of the (unknown) population moments. This yields estimates of those parameters.

## Method

Suppose that the problem is to estimate $k$ unknown parameters $\theta_{1}, \theta_{2}, \dots, \theta_{k}$ characterizing the distribution $f_{W}(w; \theta)$ of the random variable $W$. Suppose the first $k$ moments of the true distribution (the "population moments") can be expressed as functions of the $\theta$s:

$\mu_{1} \equiv E[W^1]=g_{1}(\theta_{1}, \theta_{2}, \dots, \theta_{k}) ,$
$\mu_{2} \equiv E[W^2]=g_{2}(\theta_{1}, \theta_{2}, \dots, \theta_{k}) ,$
$\vdots$
$\mu_{k} \equiv E[W^k]=g_{k}(\theta_{1}, \theta_{2}, \dots, \theta_{k}) .$

Suppose a sample of size $n$ is drawn, resulting in the values $w_1, \dots, w_n$. For $j=1,\dots,k$, let

$\hat{\mu}_{j}=\frac{1}{n}\sum_{i=1}^{n} w_{i}^{j}$

be the $j$-th sample moment, an estimate of $\mu_{j}$. The method of moments estimator for $\theta_{1}, \theta_{2}, \dots, \theta_{k}$, denoted by $\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}$, is defined as the solution (if one exists) to the equations:

$\hat{\mu}_{1} = g_{1}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) ,$
$\hat{\mu}_{2} = g_{2}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) ,$
$\vdots$
$\hat{\mu}_{k} = g_{k}(\hat{\theta}_{1}, \hat{\theta}_{2}, \dots, \hat{\theta}_{k}) .$
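
To make the procedure concrete, here is a minimal Python sketch, assuming a normal distribution with $k = 2$ unknowns $\theta_1 = \mu$ and $\theta_2 = \sigma$, so that $g_1(\mu,\sigma) = \mu$ and $g_2(\mu,\sigma) = \mu^2 + \sigma^2$. The sample size, true parameter values, and starting point are illustrative assumptions, and the moment equations are solved numerically with `scipy.optimize.fsolve`.

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative sample: normal with (assumed) true mu = 2.0, sigma = 3.0.
rng = np.random.default_rng(0)
w = rng.normal(loc=2.0, scale=3.0, size=10_000)

# Sample moments: mu_hat_j = (1/n) * sum_i w_i**j, for j = 1, 2.
mu_hat = [np.mean(w**j) for j in (1, 2)]

# Moment equations g_j(theta) = mu_hat_j, written as a root-finding system.
# For the normal: g_1(mu, sigma) = mu, g_2(mu, sigma) = mu**2 + sigma**2.
def equations(theta):
    mu, sigma = theta
    return [mu - mu_hat[0],
            mu**2 + sigma**2 - mu_hat[1]]

mu_est, sigma_est = fsolve(equations, x0=[1.0, 1.0])
print(mu_est, sigma_est)  # expected to be close to 2.0 and 3.0
```

For the normal distribution this system can of course be solved in closed form; the numerical solver simply mirrors the general recipe for cases where the $g_j$ are harder to invert.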

## Example

Suppose $X_1, \dots, X_n$ are independent, identically distributed random variables following a gamma distribution with probability density function

$\frac{x^{\alpha-1} e^{-x/\beta}}{\beta^{\alpha}\, \Gamma(\alpha)}$

for $x > 0$, and $0$ for $x \le 0$.

The first moment, i.e., the expected value, of a random variable with this probability distribution is

$\operatorname{E}(X_1)=\alpha\beta$

and the second moment, i.e., the expected value of its square, is

$\operatorname{E}(X_1^2)=\beta^2\alpha(\alpha+1).$

These are the "population moments".

The first and second "sample moments" $m_1$ and $m_2$ are respectively

$m_{1} = \frac{X_1+\cdots+X_n}{n}$

and

$m_{2} = \frac{X_1^2+\cdots+X_n^2}{n}.$

Equating the population moments with the sample moments, we get

$\alpha\beta = m_{1}$

and

$\beta^2\alpha(\alpha+1) = m_{2}.$
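
The first equation gives $\beta = m_{1}/\alpha$; substituting into the second,

$\frac{m_{1}^2}{\alpha^2}\,\alpha(\alpha+1) = m_{2} \quad\Longrightarrow\quad m_{1}^2(\alpha+1) = m_{2}\,\alpha \quad\Longrightarrow\quad \alpha\,(m_{2}-m_{1}^2) = m_{1}^2 .$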

Solving for $\alpha$, and then recovering $\beta = m_{1}/\alpha$, we get

$\alpha = \frac{m_{1}^2}{m_{2} - m_{1}^2}$

and

$\beta = \frac{m_{2} - m_{1}^2}{m_{1}}.$

We then use these two quantities as estimates, based on the sample, of the two unobservable population parameters $\alpha$ and $\beta$.
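
A short numerical check of these formulas, as a Python sketch; the true parameter values and sample size below are assumptions made for illustration:

```python
import numpy as np

# Draw a gamma sample with (assumed) true shape alpha = 2.5, scale beta = 1.5.
rng = np.random.default_rng(1)
alpha_true, beta_true = 2.5, 1.5
x = rng.gamma(shape=alpha_true, scale=beta_true, size=50_000)

# First and second sample moments.
m1 = np.mean(x)
m2 = np.mean(x**2)

# Method-of-moments estimates from the closed-form solution above.
alpha_hat = m1**2 / (m2 - m1**2)
beta_hat = (m2 - m1**2) / m1
print(alpha_hat, beta_hat)  # expected to be close to 2.5 and 1.5
```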