Moment-generating function

In probability theory and statistics, the moment-generating function of a random variable is an alternative specification of its probability distribution (however, note that not all random variables have moment-generating functions). Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables.

In addition to univariate distributions, moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases.

The moment-generating function does not always exist even for real-valued arguments, unlike the characteristic function. There are relations between the behavior of the moment-generating function of a distribution and properties of the distribution, such as the existence of moments.

Definition

In probability theory and statistics, the moment-generating function of a random variable X is

    M_X(t) := E[e^{tX}],    t ∈ R,

wherever this expectation exists.

M_X(0) always exists and is equal to 1.

A key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the characteristic function always exists (because it is the integral of a bounded function on a space of finite measure), and thus may be used instead.

More generally, where X = (X_1, ..., X_n)^T is an n-dimensional random vector, one uses t^T X (with t a vector) instead of tX:

    M_X(t) := E[e^{t^T X}].

The reason for defining this function is that it can be used to find all the moments of the distribution.[1] The series expansion of e^{tX} is

    e^{tX} = 1 + tX + (t^2 X^2)/2! + (t^3 X^3)/3! + ... + (t^n X^n)/n! + ...

Hence

    M_X(t) = E[e^{tX}] = 1 + t m_1 + (t^2 m_2)/2! + (t^3 m_3)/3! + ... + (t^n m_n)/n! + ...,

where m_n is the nth moment.

If we differentiate M_X(t) i times with respect to t and then set t = 0, we therefore obtain the ith moment about the origin, m_i.
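
As a concrete illustration (a hypothetical sketch, not part of the original article), the following SymPy snippet differentiates the moment-generating function of an exponential distribution, M_X(t) = λ/(λ − t), at t = 0 and recovers the moments m_i = i!/λ^i:

    import sympy as sp

    t = sp.symbols('t', real=True)
    lam = sp.symbols('lambda', positive=True)

    # Moment-generating function of an Exp(lambda) random variable (valid for t < lambda)
    M = lam / (lam - t)

    # The i-th derivative at t = 0 is the i-th moment about the origin, m_i = i!/lambda**i
    for i in range(1, 4):
        m_i = sp.diff(M, t, i).subs(t, 0)
        print(i, sp.simplify(m_i))  # prints 1/lambda, 2/lambda**2, 6/lambda**3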

Examples

Distribution | Moment-generating function M_X(t) | Characteristic function φ(t)
Bernoulli P(X = 1) = p | 1 − p + p e^t | 1 − p + p e^{it}
Geometric (number of trials to first success, success probability p) | p e^t / (1 − (1 − p) e^t), for t < −ln(1 − p) | p e^{it} / (1 − (1 − p) e^{it})
Binomial B(n, p) | (1 − p + p e^t)^n | (1 − p + p e^{it})^n
Poisson Pois(λ) | e^{λ(e^t − 1)} | e^{λ(e^{it} − 1)}
Uniform U(a, b) | (e^{tb} − e^{ta}) / (t(b − a)) | (e^{itb} − e^{ita}) / (it(b − a))
Normal N(μ, σ^2) | e^{tμ + σ^2 t^2/2} | e^{itμ − σ^2 t^2/2}
Chi-squared χ^2_k | (1 − 2t)^{−k/2}, for t < 1/2 | (1 − 2it)^{−k/2}
Gamma Γ(k, θ) | (1 − tθ)^{−k}, for t < 1/θ | (1 − itθ)^{−k}
Exponential Exp(λ) | (1 − t/λ)^{−1}, for t < λ | (1 − it/λ)^{−1}
Multivariate normal N(μ, Σ) | e^{t^T μ + (1/2) t^T Σ t} | e^{i t^T μ − (1/2) t^T Σ t}
Degenerate δ_a | e^{ta} | e^{ita}
Laplace L(μ, b) | e^{tμ} / (1 − b^2 t^2), for −1/b < t < 1/b | e^{itμ} / (1 + b^2 t^2)
Cauchy Cauchy(μ, θ) | not defined | e^{itμ − θ|t|}
Negative binomial NB(r, p) (failures before the rth success, success probability p) | (p / (1 − (1 − p) e^t))^r, for t < −ln(1 − p) | (p / (1 − (1 − p) e^{it}))^r
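
As a quick numerical check of one table entry (a hypothetical sketch, not part of the original article), the following NumPy snippet compares the empirical average of e^{tX} for binomial samples with the closed form (1 − p + p e^t)^n:

    import numpy as np

    rng = np.random.default_rng(0)
    n, p, t = 10, 0.3, 0.5

    # Empirical estimate of E[exp(t X)] for X ~ Binomial(n, p)
    samples = rng.binomial(n, p, size=1_000_000)
    empirical = np.exp(t * samples).mean()

    # Closed form from the table
    closed_form = (1 - p + p * np.exp(t)) ** n

    print(empirical, closed_form)  # the two values should agree closely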

Calculation

The moment-generating function is given by the Riemann–Stieltjes integral

    M_X(t) = ∫ e^{tx} dF(x),

where F is the cumulative distribution function.

If X has a continuous probability density function ƒ(x), then M_X(−t) is the two-sided Laplace transform of ƒ(x):

    M_X(t) = ∫_{−∞}^{∞} e^{tx} ƒ(x) dx
           = ∫_{−∞}^{∞} (1 + tx + (t^2 x^2)/2! + ... + (t^n x^n)/n! + ...) ƒ(x) dx
           = 1 + t m_1 + (t^2 m_2)/2! + ... + (t^n m_n)/n! + ...,

where m_n is the nth moment.
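
For instance (a hypothetical sketch, not part of the original article), the integral can be evaluated numerically for a normal density and compared with the closed-form table entry e^{μt + σ^2 t^2/2}:

    import numpy as np
    from scipy import integrate

    mu, sigma, t = 1.0, 2.0, 0.3

    def integrand(x):
        # e^{t x} f(x), with f the N(mu, sigma^2) density
        f = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
        return np.exp(t * x) * f

    numeric, _ = integrate.quad(integrand, -np.inf, np.inf)
    closed_form = np.exp(mu * t + 0.5 * sigma ** 2 * t ** 2)

    print(numeric, closed_form)  # both should be approximately 1.616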

Sum of independent random variables

If X_1, X_2, ..., X_n is a sequence of independent (and not necessarily identically distributed) random variables, and

    S_n = Σ_{i=1}^{n} a_i X_i,

where the a_i are constants, then the probability density function for S_n is the convolution of the probability density functions of the a_i X_i, and the moment-generating function for S_n is given by

    M_{S_n}(t) = E[e^{t S_n}] = M_{X_1}(a_1 t) M_{X_2}(a_2 t) ... M_{X_n}(a_n t).
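
As a concrete check (a hypothetical sketch, not part of the original article), the SymPy snippet below multiplies the moment-generating functions of two independent Poisson variables and confirms that the product equals the moment-generating function of a Poisson variable with the summed rate:

    import sympy as sp

    t = sp.symbols('t', real=True)
    l1, l2 = sp.symbols('lambda1 lambda2', positive=True)

    # MGFs of two independent Poisson random variables (see the table above)
    M1 = sp.exp(l1 * (sp.exp(t) - 1))
    M2 = sp.exp(l2 * (sp.exp(t) - 1))

    # The MGF of the sum is the product of the individual MGFs
    M_sum = M1 * M2

    # Compare with the MGF of Poisson(lambda1 + lambda2)
    print(M_sum.equals(sp.exp((l1 + l2) * (sp.exp(t) - 1))))  # True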

Vector-valued random variables

For vector-valued random variables X with real components, the moment-generating function is given by

    M_X(t) = E[e^{⟨t, X⟩}],

where t is a vector and ⟨t, X⟩ is the dot product.
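
As an illustration (a hypothetical sketch, not part of the original article), the snippet below estimates E[e^{⟨t, X⟩}] for a bivariate normal vector by simulation and compares it with the multivariate normal entry e^{t^T μ + (1/2) t^T Σ t} from the table above:

    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.array([0.5, -1.0])
    Sigma = np.array([[1.0, 0.3],
                      [0.3, 0.5]])
    t = np.array([0.2, 0.4])

    # Empirical estimate of E[exp(<t, X>)] for X ~ N(mu, Sigma)
    X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
    empirical = np.exp(X @ t).mean()

    # Closed form: exp(t.mu + (1/2) t.Sigma.t)
    closed_form = np.exp(t @ mu + 0.5 * t @ Sigma @ t)

    print(empirical, closed_form)  # the two values should agree closely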

Important properties

The most important property of the moment-generating function is that if two distributions have the same moment-generating function, then they are identical at all points. That is, if for all values of t

    M_X(t) = M_Y(t),

then

    F_X(x) = F_Y(x)

for all values of x (or, equivalently, X and Y have the same distribution). This statement is not equivalent to "if two distributions have the same moments, then they are identical at all points", because in some cases the moments exist and yet the moment-generating function does not, because the limit

    lim_{n→∞} Σ_{k=0}^{n} (t^k m_k)/k!

does not exist. This happens for the lognormal distribution.

Calculations of moments

The moment-generating function is so called because if it exists on an open interval around t = 0, then it is the exponential generating function of the moments of the probability distribution:

    m_n = E[X^n] = M_X^{(n)}(0) = (d^n M_X / dt^n) evaluated at t = 0.

Here n must be a nonnegative integer.
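
For example (a hypothetical sketch, not part of the original article), expanding the moment-generating function of a standard normal variable, M_X(t) = e^{t^2/2}, as a power series and multiplying the coefficient of t^n by n! recovers the moments 0, 1, 0, 3, 0, 15, ...:

    import sympy as sp

    t = sp.symbols('t')

    # MGF of a standard normal random variable
    M = sp.exp(t ** 2 / 2)

    # m_n is n! times the coefficient of t**n in the Taylor series of M around 0
    series = sp.series(M, t, 0, 7).removeO()
    for n in range(1, 7):
        m_n = sp.factorial(n) * series.coeff(t, n)
        print(n, m_n)  # 0, 1, 0, 3, 0, 15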

Other properties

Hoeffding's lemma provides a bound on the moment-generating function in the case of a zero-mean, bounded random variable.
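
For reference, the bound in question can be stated as follows: if E[X] = 0 and a ≤ X ≤ b almost surely, then

    E[e^{tX}] ≤ e^{t^2 (b − a)^2 / 8}

for every real t.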

Relation to other functions

Related to the moment-generating function are a number of other transforms that are common in probability theory:

characteristic function
The characteristic function φ_X(t) is related to the moment-generating function via φ_X(t) = M_{iX}(t) = M_X(it): that is, the characteristic function is the moment-generating function of iX, or equivalently the moment-generating function of X evaluated on the imaginary axis (a short symbolic check appears after this list). This function can also be viewed as the Fourier transform of the probability density function, which can therefore be deduced from it by inverse Fourier transform.
cumulant-generating function
The cumulant-generating function is defined as the logarithm of the moment-generating function; some instead define the cumulant-generating function as the logarithm of the characteristic function, while others call this latter the second cumulant-generating function.
probability-generating function
The probability-generating function is defined as G(z) = E[z^X]. This immediately implies that G(e^t) = E[e^{tX}] = M_X(t).
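
As a small illustration of the first relation above (a hypothetical sketch, not part of the original article), substituting it for t in the moment-generating function of a normal variable reproduces the characteristic function listed in the table above:

    import sympy as sp

    t, s = sp.symbols('t s', real=True)
    mu = sp.symbols('mu', real=True)
    sigma = sp.symbols('sigma', positive=True)

    # MGF of N(mu, sigma^2)
    M = sp.exp(mu * s + sigma ** 2 * s ** 2 / 2)

    # Characteristic function: phi(t) = M(i t)
    phi = M.subs(s, sp.I * t)

    print(sp.simplify(phi))  # exp(I*mu*t - sigma**2*t**2/2)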

See also

References

  1. ^ Bulmer, M.G., Principles of Statistics, Dover, 1979, pp. 75–79
  • Casella, George; Berger, Roger. Statistical Inference (2nd ed.). pp. 59–68. ISBN 978-0-534-24312-8.