# Talk:Moment-generating function

WikiProject Statistics (Rated Start-class, High-importance)

This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.

WikiProject Mathematics (Rated Start-class, Mid-importance; Field: Probability and statistics)

This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

## Hyphenation

Spelling question: I've never (before now) seen the name spelled with a hyphen. Searches of Math Reviews (MathSciNet) and Current Index to Statistics show an overwhelming preference for no hyphen. Should the title, at least, be changed (move the article to "Moment generating function" with a redirect)? Zaslav 18:08, 8 December 2006 (UTC)

Sir Ronald Fisher always used the hyphen in "moment-generating function". This is an instance of the fact that in this era the traditional hyphenation rules are usually not followed in scholarly writing, nor in advertising or package labelling, although they're still generally followed in newspapers, magazines, and novels. This particular term seldom appears in novels, advertisements, etc. Personally I prefer the traditional rules because in some cases they are a very efficient disambiguating tool. Michael Hardy 20:03, 8 December 2006 (UTC)
Some literature sources also use the hyphen, for example Paul L. Meyer's Introductory Probability and Statistical Applications. Apoorv020 (talk) 20:23, 10 October 2009 (UTC)

## Terms

I would like _all_ the terms such as E to be defined explicitly. Otherwise these articles are unintelligible to the casual reader. I would have thought that all terms in any formula should be defined in every article, or else reference should be made to some common form of definition of terms for that context. How about a bit more help for the randomly browsing casual student? I would like to see a recommendation in the Wikipedia "guidelines for authors" defining some kind of standard for this; otherwise it is very arbitrary which terms are defined and which are expected to be known. —Preceding unsigned comment added by 220.253.60.249 (talkcontribs)

Seriously, this article is written for people who already know this stuff apparently. The article doesn't really say what E is, and for that matter what is t? Seriously, for M(t) what is t? mislih 20:33, 10 July 2009 (UTC)
t is the dummy argument used to define the function, like the x when one defines the squaring function ƒ by saying ƒ(x) = x2. And it says so in the second line of the article, where it says t ∈ R. If you're not following that, then what you don't know that you'd need to know to understand it is the topics of other articles, not of this one. In cases where the random variable is measured in some units, the units of t must be the reciprocal of the units of x. As for what E is, there is a link to expectation in the next line. Anyone who knows the basic facts about probability distributions has seen this notation. Michael Hardy (talk) 22:08, 10 July 2009 (UTC)

Certainly one could put in links to those things, but this article is the wrong place to explain what "E" is, just as an article about Shakespeare's plays is the wrong place to explain how to spell "Denmark", saying that the "D" represents the initial sound in "dog", etc.

This is not written for people who already know this material.

It is written for people who already know what probability distributions are and the standard basic facts about them. Michael Hardy (talk) 21:47, 10 July 2009 (UTC)

Thanks for the sarcasm, hope you feel a little better about yourself. mislih 23:32, 5 August 2009 (UTC)

It is strong and unusual to say that the MGF of a rv X "...is an alternative definition of its probability distribution." —Preceding unsigned comment added by 71.199.181.122 (talk) 21:46, 28 September 2010 (UTC)


There is a link to the expected value article. It would probably clutter articles to have detailed explanations of each preceding idea necessary to discuss the current one, but it might be a good idea to list the articles that should be understood before reading the current one. —Preceding unsigned comment added by 141.225.193.194 (talk) 01:32, 31 January 2011 (UTC)

I agree with a lot of the above, but I think it should be stated explicitly that t is just a dummy variable with no intrinsic meaning. Thomas Tvileren (talk) 20:52, 15 November 2012 (UTC)

## Definition

The definition of the nth moment is wrong: the last equality is identically zero, since the nth derivative of 1 evaluated at t = 0 is always zero. The evaluation bar must be placed at the end (so we know we are differentiating M_X(t) n times and then evaluating at zero).

The only thing I find here that resembles a definition of the nth moment is where it says:
the nth moment is given by
$E\left(X^n\right)\,$
That definition is correct.
I take the expression
$\left.\frac{\mathrm{d}^n}{\mathrm{d}t^n}\right|_{t=0} M_X(t)$
to mean that we are first differentiating n times and then evaluating at zero. Unless you were referring to something else, your criticism seems misplaced. Michael Hardy 22:05, 8 April 2006 (UTC)
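A quick numerical sanity check of this reading: differentiate first, then evaluate at zero. The sketch below uses the standard normal mgf $M(t)=e^{t^2/2}$ (an illustrative choice, not taken from the article) and central finite differences, so the derivatives recover the first two raw moments 0 and 1 rather than collapsing to zero.

```python
import math

# mgf of the standard normal distribution: M(t) = exp(t^2 / 2)
def M(t):
    return math.exp(t * t / 2)

# Differentiate FIRST, then evaluate at t = 0,
# approximated here by central finite differences.
h = 1e-4
first_deriv = (M(h) - M(-h)) / (2 * h)            # ~ E[X]   = 0
second_deriv = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ E[X^2] = 1

print(first_deriv, second_deriv)
```

Had we instead first evaluated $M_X(0)=1$ and then differentiated that constant, every derivative would indeed be zero, which is the misreading described above.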

Please provide a few examples, e.g. for a Gaussian distribution.

For the Gaussian distribution
$f(x)=\frac 1 {\sqrt{2\pi}\sigma} e^{-(x-\mu)^2/2\sigma^2},$
the moment-generating function is
$M(t)=\int_{-\infty}^{\infty}e^{tx}f(x)\,dx=\frac 1 {\sqrt{2\pi}\sigma} \int_{-\infty}^{\infty} \exp \left[tx-\frac{(x-\mu)^2}{2\sigma^2}\right]dx$
$=\frac 1 {\sqrt{2\pi}\sigma}\int_{-\infty}^{\infty} \exp\left[\frac{-1}{2\sigma^2} \left((x-\mu)^2-2\sigma^2tx\right)\right]dx.$
Completing the square and simplifying, one obtains
$M(t) = \exp \left(\mu t + \frac 1 2 \sigma^2 t^2 \right).$
(mostly borrowed from article normal distribution.) I don't know if there's enough space for a complete derivation. The "completing the square" part is rather tedious. -- 130.94.162.64 00:37, 17 June 2006 (UTC)
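The closed form can also be checked without doing the tedious algebra. The sketch below (stdlib only; $\mu$, $\sigma$, and $t$ chosen arbitrarily) integrates $e^{tx}f(x)$ with the trapezoidal rule and compares against $\exp(\mu t + \tfrac{1}{2}\sigma^2 t^2)$:

```python
import math

mu, sigma, t = 1.0, 2.0, 0.3  # arbitrary example parameters

def f(x):
    # Gaussian density with mean mu and standard deviation sigma
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def integrand(x):
    return math.exp(t * x) * f(x)

# Trapezoidal rule over a wide interval; the integrand is
# negligible outside mu +/- 40*sigma for this t.
a, b, n = mu - 40 * sigma, mu + 40 * sigma, 200_000
h = (b - a) / n
integral = 0.5 * (integrand(a) + integrand(b))
integral += sum(integrand(a + i * h) for i in range(1, n))
integral *= h

closed_form = math.exp(mu * t + 0.5 * sigma ** 2 * t ** 2)
print(integral, closed_form)
```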

I would also like to see some more in the article about some basic properties of the moment-generating function, such as convexity, non-negativity, the fact that M(0) always equals one, and also some other not-so-obvious properties (of which I lack knowledge) indicating what the mgf is used for. --130.94.162.64 00:55, 17 June 2006 (UTC)

Also, is it true that "Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann-Stieltjes integral"? When you calculate the MGF of the Poisson distribution, X ~ Poisson(λ), the correct formula to use is $M_X(t)=E[e^{tX}]=\sum_{x=0}^\infty e^{tx}p(x)$. This is clearly not an integral. Does the Riemann-Stieltjes integral include summation as well? If not, the quoted statement is wrong and should be removed from the article. —Preceding unsigned comment added by Sjayzzang (talkcontribs) 20:02, 15 April 2009 (UTC)

Certainly a sum is a Riemann–Stieltjes integral. One would hope that that article would be crystal-clear about that. I'll take a look. Michael Hardy (talk) 21:55, 10 July 2009 (UTC)
...I see: that article is not explicit about that. Michael Hardy (talk) 21:56, 10 July 2009 (UTC)
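For what it's worth, the Poisson case is easy to check numerically: the sum form of the Riemann–Stieltjes integral agrees with the known closed form $e^{\lambda(e^t-1)}$. A sketch with arbitrarily chosen $\lambda$ and $t$:

```python
import math

lam, t = 3.0, 0.5  # arbitrary example values

# For a discrete distribution, the Riemann-Stieltjes integral
# reduces to a sum over the support. Accumulate the Poisson pmf
# term by term to avoid overflowing factorials.
mgf_sum = 0.0
term = math.exp(-lam)          # P(X = 0)
for k in range(100):
    mgf_sum += math.exp(t * k) * term
    term *= lam / (k + 1)      # P(X = k+1) from P(X = k)

# Known closed form for the Poisson mgf
mgf_closed = math.exp(lam * (math.exp(t) - 1))
print(mgf_sum, mgf_closed)
```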

## Vector of random variables or stochastic process

We should mention the case when X is a vector of random variables or a stochastic process. Jackzhp 22:29, 3 September 2006 (UTC)

## Properties would be nice

There are a whole bunch of properties of MGFs that it would be nice to include -- e.g. the MGF of a linear transformation of a random variable, MGF of a sum of independent random variables, etc.
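Two such properties are $M_{aX+b}(t) = e^{bt}M_X(at)$ and, for independent $X$ and $Y$, $M_{X+Y}(t)=M_X(t)M_Y(t)$. Both can be illustrated with normal mgfs; the sketch below uses arbitrarily chosen parameters and the closed form $e^{\mu t + \sigma^2 t^2/2}$:

```python
import math

def mgf_normal(mu, sigma2, t):
    # Closed-form mgf of N(mu, sigma2): exp(mu*t + sigma2*t^2/2)
    return math.exp(mu * t + 0.5 * sigma2 * t * t)

t = 0.7            # arbitrary evaluation point
a, b = 2.0, -1.0   # arbitrary linear transformation aX + b
mu, sigma2 = 1.5, 4.0

# Linear transformation: if X ~ N(mu, sigma2), then aX + b ~ N(a*mu + b, a^2*sigma2),
# whose mgf equals exp(b*t) * M_X(a*t)
lhs = mgf_normal(a * mu + b, a * a * sigma2, t)
rhs = math.exp(b * t) * mgf_normal(mu, sigma2, a * t)

# Sum of independents: X ~ N(1, 4), Y ~ N(2, 9)  =>  X + Y ~ N(3, 13),
# whose mgf equals M_X(t) * M_Y(t)
lhs2 = mgf_normal(3.0, 13.0, t)
rhs2 = mgf_normal(1.0, 4.0, t) * mgf_normal(2.0, 9.0, t)

print(lhs, rhs, lhs2, rhs2)
```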

## Discrete form of mgf

Something should be added about the discrete form of the mgf, no? $M_X(s)=E(e^{sX})=\sum_{k=1}^\infty p(k)e^{sk}$ 24.136.121.150 08:37, 20 January 2007 (UTC)

## Common Moment Generating Functions

It would seem like a good and obvious thing to include a link to a Wikipedia page which tabulates common moment generating functions (i.e., the moment generating functions for common statistical distributions). The information is already there on Wikipedia; it would just be a case of organising it a little better.

Also, there is probably some efficient way in which the set of all functions that commonly occur when dealing with statistical distributions can be organised to highlight possible inter-relationships (perhaps some mgfs are nested mgfs, so that the fact that

$Mgf_3(s)=Mgf_2(Mgf_1(s))$

could be highlighted in a list of mgf interdependencies...).

ConcernedScientist (talk) 00:47, 18 February 2009 (UTC)

• Hope the changes satisfy your concerns about the page. Apoorv020 (talk) 20:19, 10 October 2009 (UTC)

## Uniqueness

We have a theorem that if two mgfs coincide in some region around 0, then the corresponding random variables have the same distribution. There is, however, a concern that this statement, while true from the point of view of a mathematician, is not so reliable from the point of view of an applied statistician. McCullagh (1994)[1] gives the following example:

$f_1(x) = \frac{1}{\sqrt{2\pi}}e^{-x^2/2}, \quad f_2(x) = f_1(x)\Big(1 + \frac{1}{2}\sin(2\pi x)\Big)$

with cumulant generating functions

$K_1(t) = t^2/2, \quad K_2(t) = K_1(t) + \ln\Big(1 + \frac{1}{2}e^{-2\pi^2}\sin(2\pi t)\Big)$

Although the densities are visibly different, their corresponding cgfs are virtually indistinguishable, with a maximum difference of less than $1.34\times 10^{-9}$ over the entire range. Thus, from a numerical standpoint, mgfs fail to uniquely determine a distribution.
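That bound is easy to reproduce: the difference of the two cgfs is $\ln\big(1+\tfrac{1}{2}e^{-2\pi^2}\sin(2\pi t)\big)$, whose magnitude never exceeds roughly $\tfrac{1}{2}e^{-2\pi^2}\approx 1.34\times 10^{-9}$. A quick numerical sketch:

```python
import math

# Amplitude of the perturbation in K2(t) - K1(t)
eps = 0.5 * math.exp(-2 * math.pi ** 2)

# Scan |K2(t) - K1(t)| = |log(1 + eps*sin(2*pi*t))| on a grid of t values;
# log1p keeps full accuracy for such tiny arguments.
max_diff = max(abs(math.log1p(eps * math.sin(2 * math.pi * (i / 1000))))
               for i in range(-1000, 1001))

print(max_diff)  # about 1.34e-9
```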

On the other hand, Waller (1995)[2] shows that the characteristic function does a much better job of determining the distribution.

1. ^ McCullagh, P. (1994). Does the moment generating function characterize a distribution? The American Statistician, 48, p. 208.
2. ^ Waller, L.A. (1995). Does the characteristic function numerically distinguish distributions? The American Statistician, 49, pp. 150–151.

Even from a mathematician's point of view, don't you need the MGF to be infinitely differentiable at 0 for uniqueness? Paulginz (talk) 14:45, 24 November 2009 (UTC)

## Purpose

This page doesn't seem to explain the purpose of the function. —Preceding unsigned comment added by 129.16.204.227 (talk) 13:07, 10 September 2009 (UTC)

• Hope your concerns are now satisfied. Apoorv020 (talk) 20:20, 10 October 2009 (UTC)

Thanks. Your edits very much improve the quality of the article. --129.16.24.201 (talk) 08:59, 10 November 2009 (UTC)

I would say that the topic of Moment Generating Functions falls under "Subject-specific common knowledge" so adding inline citations is just really unnecessary. Any intro college level probability course will cover this article as it exists now pretty exhaustively.
Mike409 (talk) 11:17, 3 February 2010 (UTC)
No such thing as "Subject-specific common knowledge" ... citations are always needed. Melcombe (talk) 13:56, 3 February 2010 (UTC)
• I added a section called "Why the moment-generating function is defined this way". If you want to rename that as "Purpose" go right ahead.

Kimaaron (talk) 21:34, 20 February 2010 (UTC)

## Raw moments vs. central moments

The MGF calculates the raw moments as opposed to the central moments. It already says "moments around the origin" on the page, but perhaps this should be noted explicitly at the end of the introduction ("The moment generating function is named as such because of its intimate link to the raw moments") and again in the definition (only inserting "raw"), e.g.: ...where $m_n$ is the nth raw moment. Currently the first definition on the moment page is of central moments, so a bit of confusion could occur. /Jens — Preceding unsigned comment added by 203.110.235.1 (talk) 23:55, 21 January 2013 (UTC)