# Talk:Moment (mathematics)

WikiProject Statistics (Rated C-class, High-importance)

This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.

WikiProject Mathematics (Rated C-class, Mid-importance; Field: Analysis)

This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

## Untitled

If no one objects in the next few days, I will move this page to moment (mathematics) so moment (physics) can be moved here. Even the first sentence of this article makes it clear that the physics definition takes precedence. moink 22:24, 26 Dec 2003 (UTC)

I object on the grounds that most pages that might link to moment ought to go to moment (mathematics) and not to moment (physics). I have moved the article to moment (mathematics) and made this a disambiguation page. After that I changed the various links to this page. A few I directed to moment (physics) but most to moment (mathematics); the physics article would have been grossly inappropriate. Michael Hardy 22:56, 26 Dec 2003 (UTC)
Okay, but now that you've fixed the links, it should be better. I still think physics is the primary one. We'll put a link to the mathematics article. moink 05:23, 27 Dec 2003 (UTC)

## skewness and kurtosis

An editor put these in the article

Attention!!! There is an error. The third central moment is not a skewness, please refer to skewness.
Attention!!! There is an error. The fourth central moment is not a kurtosis, please refer to kurtosis.

I believe I understood his point, and made some corrections. Tom Harrison (talk) 02:35, 29 November 2005 (UTC)

It seems that according to the articles on skewness and kurtosis that skewness and kurtosis are defined incorrectly here. --Justind 00:50, 14 September 2006 (UTC)

The definitions here seem identical to those in skewness and kurtosis: the third and fourth cumulants normalized ("normalized" means divided respectively by the third and fourth powers of the standard deviation; the third cumulant is the third central moment; the fourth cumulant is NOT the fourth central moment). Michael Hardy 02:31, 14 September 2006 (UTC)
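To make the normalization being discussed concrete, here is a small self-contained sketch (my own illustration, not taken from the article) computing the standardized third and fourth sample moments, using the population convention of dividing by n:

```python
# Sample skewness and kurtosis as normalized central moments:
# skewness = m3 / s^3 and kurtosis = m4 / s^4, where m_k is the k-th
# central moment and s is the standard deviation.
def central_moment(xs, k):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** k for x in xs) / len(xs)

def skewness(xs):
    s = central_moment(xs, 2) ** 0.5  # standard deviation
    return central_moment(xs, 3) / s ** 3

def kurtosis(xs):
    s2 = central_moment(xs, 2)  # variance
    return central_moment(xs, 4) / s2 ** 2

xs = [2.0, 2.0, 3.0, 5.0]
print(skewness(xs))  # positive: the sample has a long right tail
print(kurtosis(xs))  # 2.0 for this sample
```

Note that the normalizing quantity is the standard deviation raised to the third and fourth powers respectively, exactly as described above.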

## legibility

On many occasions I've commented on the relative legibility of the following two things, differing both in spacing and in the length of the minus sign:

x − t
x-t

But now another issue arises:

The kurtosis κ is defined to be the normalized fourth central moment minus 3.
The kurtosis κ is defined to be the normalized fourth central moment - 3.

In the second of these expressions, the hyphen used as a minus sign could be mistaken for a dash, so it's like saying:

The kurtosis κ is defined to be the US president - George W. Bush.

In other words, it's as if it said the kurtosis is 3. Michael Hardy 23:04, 31 March 2006 (UTC)

## What does this mean?

The inequality can be proven by considering
${\displaystyle \operatorname {Exp} ((t^{2}-at)^{2})}$
where t = (x − μ)/σ.

What does the above mean? "exp" with a lower-case e usually means the exponential function, exp(u) = e^u. I suspect what was intended was "expected value". But a simple "E" is what is usually used for that. And I realize using a lower-case letter as a random variable is done by some respectable writers, and more so if you go back 75 years or so, but one should be stylistically consistent. It helps avoid confusion. Michael Hardy 23:27, 31 March 2006 (UTC)

Hello, can someone please elucidate this proof? I tried to work through it, but it seems to prove something different (namely, that the square of the expectation value of t^3 is less than or equal to the product of the expectation value of t^2 and the expectation value of t^4). Some additional hints here by an expert (or a retraction, if incorrect) would be appreciated. By the way, the result is correct (it is proved elsewhere, using methods that are much more complicated than what is suggested here; see for example Minimal Variance and its Relation to Efficient Moment Tests; I am simply questioning the line of reasoning suggested for the proof.--Fizzbowen (talk) 07:53, 17 November 2009 (UTC)

The formula for the expectation was indeed incorrect. To produce the desired inequality, the expression E((T^2 - aT)^2) should instead have been E((T^2 - aT - 1)^2). I just corrected the page. Patrug (talk) 09:28, 16 June 2011 (UTC)patrug
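For later readers, a short expansion (my own sketch, writing T = (X − μ)/σ so that E(T) = 0 and E(T²) = 1, with γ₁ = E(T³) the skewness and κ = E(T⁴) the kurtosis) shows why the corrected expression produces the inequality:

```latex
\begin{align*}
0 \le \operatorname{E}\!\left((T^2 - aT - 1)^2\right)
  &= \operatorname{E}(T^4) - 2a\,\operatorname{E}(T^3)
     - 2\operatorname{E}(T^2) + a^2\operatorname{E}(T^2)
     + 2a\,\operatorname{E}(T) + 1 \\
  &= \kappa - 2a\gamma_1 + a^2 - 1 .
\end{align*}
```

Minimizing over a (i.e. taking a = γ₁) yields κ ≥ γ₁² + 1, the desired bound.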

## Moment Generating function

I think that we should mention the moment generating function in the article: M(t) = E(e^{tX}) = ...

Jackzhp 15:23, 3 September 2006 (UTC)
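For context, a short statement of the standard connection (assuming the MGF exists in a neighborhood of 0): its derivatives at 0 recover the raw moments, which is what makes it relevant to this article.

```latex
M_X(t) = \operatorname{E}\!\left(e^{tX}\right)
       = \sum_{n=0}^{\infty} \frac{t^n}{n!}\,\operatorname{E}(X^n),
\qquad
\operatorname{E}(X^n) = M_X^{(n)}(0).
```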

## Isn't the definition too narrow?

I think that the definition of a moment presented here is too narrow. There exist certain concepts, like Zernike moments and Legendre moments, which do not match this definition, because:

• They can be complex, not real (Zernike moments)
• They are related to functions of several variables (or at least functions of two variables, f(x,y))
• the function used inside the integral is not a simple polynomial ${\displaystyle (x-c)^{n}\,}$, but a Zernike polynomial, or Legendre polynomial.

Now, are these concepts formally moments or not?

To me, a moment is a coefficient in the expansion of a certain function ${\displaystyle f\,}$ in some class of parametrised functions ${\displaystyle g_{n,m,...}\,}$ such that:

${\displaystyle m_{i,\ldots ,k}=\int f(x,\ldots )g_{i,\ldots ,k}(x,\ldots )dx\ldots }$
${\displaystyle f(x,\ldots )=\sum _{i=0}^{\infty }\ldots \sum _{k=0}^{\infty }m_{i,\ldots ,k}g_{i,\ldots ,k}(x,\ldots )}$

Now, this nicely goes with what various authors refer to as moments. Can anybody justify this? Gutek 11:17, 15 October 2006 (UTC)

It seems the page as written is about the canonical, if you will, definition of moments. The others you mention are certainly real usages of the term. But with the exception of multivariate moments, I have always interpreted the other usages (Zernike, etc) to imply these are related to the concept of moments, or have some similarities with them. I have been working with a paper that defines something called "complex moments", but a more precise if unwieldy definition would be "particular complex quantities which are functions of (canonical) moments".
My recommendation would be to keep this page's scope close to what it is, but perhaps allow for multivariate moments (e.g., f(x,y) at top). Then perhaps have an "Other usage" section, with links to specific usages as appropriate. Just my thoughts Baccyak4H 14:11, 15 October 2006 (UTC)
Agreed. An "other usage" section could be little more to a separate "generalized moments" article, should one be written. McKay 15:57, 18 October 2006 (UTC)

## Distinguish between RAW (about 0) and CENTRAL moments (about the mean)

WHY IS IT NO ONE SAYS ANYTHING ABOUT RAW MOMENTS!!! ALL THIS ARTICLE TALKS ABOUT IS THE CENTRAL MOMENTS!!! TRY RAW MOMENTS!!! —Preceding unsigned comment added by Heero Kirashami (talkcontribs) 21:19, 27 October 2007 (UTC)

No, it is worse than that: there's an erroneous ambiguity between the two throughout the article. E[X], the expected value or mean, is the first RAW moment, not the first (central) moment - the first central moment is ZERO. Once we have the mean, we can then derive the variance D[X], which is the second CENTRAL moment. Please refer to the wiki article on Central moments. PLEASE FIX IT BY MAKING SURE YOU DISTINGUISH BETWEEN THE TWO, this is IMPORTANT! 91.193.159.70 (talk) 20:04, 10 September 2010 (UTC)
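To illustrate the distinction being requested, here is a small self-contained sketch (my own illustration): the k-th raw moment averages x^k, while the k-th central moment averages (x − mean)^k, so the first central moment of any sample is zero.

```python
# Raw vs. central moments of a finite sample.
# k-th raw moment: average of x^k; k-th central moment: average of (x - mean)^k.
def raw_moment(xs, k):
    return sum(x ** k for x in xs) / len(xs)

def central_moment(xs, k):
    m = raw_moment(xs, 1)  # the mean is the first raw moment
    return sum((x - m) ** k for x in xs) / len(xs)

xs = [1.0, 2.0, 4.0, 7.0]
print(raw_moment(xs, 1))      # 3.5 -- the mean (first RAW moment)
print(central_moment(xs, 1))  # 0.0 -- first CENTRAL moment is always zero
print(central_moment(xs, 2))  # 5.25 -- the variance (second central moment)
```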

## Possible split

Given that this was classed as in the "analysis" field of mathematics, rather than in "probability and statistics", and given that someone has complained that it contains too much about statistical applications while I would suggest that somewhat more is needed, it may be best to devise a split so that the pure-mathematics part can be left to those who want it. Thus possibly create a "moment (statistics)" article? Melcombe (talk) 10:22, 27 March 2008 (UTC)

## Partial Moments

The definition for partial moments read

${\displaystyle \mu _{n}^{-}(r)=\int _{-\infty }^{r}(\max\{x-r,0\})^{n}\,f(x)\,dx,}$
${\displaystyle \mu _{n}^{+}(r)=\int _{r}^{\infty }(-\min\{x-r,0\})^{n}\,f(x)\,dx.}$

which seems to be a conflation of two different definitions, but somehow swapped, since the effect of the max/min operator with the integration limits is to cause the result to always be zero. It seems clear that the "right" definition must be

${\displaystyle \mu _{n}^{-}(r)=\int _{-\infty }^{r}(r-x)^{n}\,f(x)\,dx,}$
${\displaystyle \mu _{n}^{+}(r)=\int _{r}^{\infty }(x-r)^{n}\,f(x)\,dx.}$

so I changed it to that, but it would be good if someone could check an actual reference. 76.201.140.116 (talk) 08:39, 13 August 2008 (UTC)
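As a partial sanity check of the corrected definitions (not a substitute for a reference), here is a small numerical sketch using the uniform density on [0, 1] — an assumption chosen purely for illustration — integrated with a simple midpoint rule. Both first partial moments about r = 0.5 should come out to 0.125.

```python
# Numerical check of the corrected partial-moment definitions.
def lower_partial_moment(n, r, f, a, steps=10000):
    # integrates (r - x)^n f(x) dx over [a, r] with the midpoint rule
    h = (r - a) / steps
    return sum((r - (a + (i + 0.5) * h)) ** n * f(a + (i + 0.5) * h)
               for i in range(steps)) * h

def upper_partial_moment(n, r, f, b, steps=10000):
    # integrates (x - r)^n f(x) dx over [r, b] with the midpoint rule
    h = (b - r) / steps
    return sum(((r + (i + 0.5) * h) - r) ** n * f(r + (i + 0.5) * h)
               for i in range(steps)) * h

uniform = lambda x: 1.0  # density of U(0, 1) on its support
print(lower_partial_moment(1, 0.5, uniform, 0.0))  # ~0.125
print(upper_partial_moment(1, 0.5, uniform, 1.0))  # ~0.125
```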

## Confusion with physics

I'm confused. In physics, the 3x3 moment of inertia matrix looks like this:

${\displaystyle \sum _{i=1}^{n}{\begin{bmatrix}y_{i}^{2}+z_{i}^{2}&-x_{i}y_{i}&-x_{i}z_{i}\\-y_{i}x_{i}&x_{i}^{2}+z_{i}^{2}&-y_{i}z_{i}\\-z_{i}x_{i}&-z_{i}y_{i}&x_{i}^{2}+y_{i}^{2}\end{bmatrix}}}$

but on this page, it looks like the moment is simply the matrix

${\displaystyle \sum _{i=1}^{n}{\begin{bmatrix}x_{i}^{2}&x_{i}y_{i}&x_{i}z_{i}\\y_{i}x_{i}&y_{i}^{2}&y_{i}z_{i}\\z_{i}x_{i}&z_{i}y_{i}&z_{i}^{2}\end{bmatrix}}}$

Could someone explain what's going on? The latter looks more like a covariance matrix, which would have its principal eigenvector along the major axis of the corresponding point cloud whereas the prior looks like a moment, with the principal eigenvector in the direction that maximizes the spread around that direction. What's going on here? Thanks. —Ben FrantzDale (talk) 18:04, 14 August 2008 (UTC)

I found a page that, at the very end, discusses this a little bit. —Ben FrantzDale (talk) 20:40, 14 August 2008 (UTC)
I posted some more at Talk:Moment_of_inertia#Analogy_with_covariance_matrix. —Ben FrantzDale (talk) 12:09, 11 September 2008 (UTC)
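In case it helps later readers, the two matrices quoted above are related by a simple identity: with unit masses, the inertia matrix equals trace(S)·Id − S, where S is the second-moment ("scatter") matrix. Here is a small self-contained numeric check (my own sketch, not from either article):

```python
# Check: inertia matrix I = trace(S) * Id - S, for unit point masses,
# where S[j][k] = sum_i p_i[j] * p_i[k] is the second-moment matrix.
points = [(1.0, 2.0, 3.0), (-1.0, 0.5, 2.0), (0.0, -2.0, 1.0)]

# second-moment (scatter) matrix
S = [[sum(p[j] * p[k] for p in points) for k in range(3)] for j in range(3)]
trS = S[0][0] + S[1][1] + S[2][2]

# inertia matrix built directly from the standard physics formula:
# diagonal entries sum the squared distances to the axis, off-diagonal
# entries are the negated products of coordinates.
I = [[sum((p[0] ** 2 + p[1] ** 2 + p[2] ** 2) * (j == k) - p[j] * p[k]
          for p in points)
      for k in range(3)] for j in range(3)]

# the two constructions agree entry by entry
same = all(abs(I[j][k] - (trS * (j == k) - S[j][k])) < 1e-12
           for j in range(3) for k in range(3))
print(same)  # True
```

So the covariance-like matrix and the inertia matrix carry the same information; they just package it differently (each is an affine function of the other).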

## Unbiasedness

The following statement may confuse readers: "It can be shown that the expected value of the sample moment is equal to the k-th moment of the population, if that moment exists, for any sample size n. It is thus an unbiased estimator." It would be nice to add the very short proof and a statement on the contrast with the central moments. Cerberus (talk) 20:07, 10 February 2010 (UTC)
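For what it's worth, the short proof for the raw sample moment is just linearity of expectation, for i.i.d. observations X₁, …, Xₙ:

```latex
\operatorname{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i^{\,k}\right]
= \frac{1}{n}\sum_{i=1}^{n} \operatorname{E}\!\left[X_i^{\,k}\right]
= \operatorname{E}\!\left[X^{k}\right] = \mu'_k .
```

The contrast with central moments is that the naive k-th sample central moment is generally biased, because the sample mean appearing inside it is itself estimated from the same data.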

## How about a definition in English?

Seeing as this is the English language Wikipedia and all, is it possible to have a definition which can be read aloud in English, which doesn't have a mathematical equation in the middle of it?

I agree - this article is demonstrative of the fatal flaw of most wikipedia math articles: they are completely useless unless you already know the subject at hand. I understand that Wikipedia is an encyclopedia, rather than an introductory textbook, but I believe significant effort should be made to present the material in a way that can be understood by those seeking to learn the subject, rather than to just remind themselves of it. —Preceding unsigned comment added by 67.127.229.86 (talk) 16:36, 27 February 2010 (UTC)

I don't think this article is all that bad, but the introduction should be a little more general, and it probably shouldn't dive right into an equation without explaining it. And if there is an equation in the introduction, then it should be one that is general enough to be understandable without a huge amount of math background. Rhetth (talk) 23:17, 6 April 2010 (UTC)
I took a stab at rephrasing the intro. This topic should be fairly approachable as it is a very physically-meaningful idea. Diagrams would help a lot. Particularly showing distributions with the same moments but different shapes and different moments but similar shapes, similar to this great example: File:Correlation_examples.png. —Ben FrantzDale (talk) 12:49, 7 April 2010 (UTC)
Thank you, BenFrantzDale. —Pengo 04:04, 9 April 2010 (UTC)
Thanks for trying to improve the intro, but it still doesn't make any sense to someone that does not already know the subject. I have a degree in math, but never studied this topic. I do not understand this intro. How about a specific definition of the word 'Moment' as used in mathematics? For example: "The Nth Moment of a distribution is the expected value of the Nth power of the deviations from a fixed value" (from Wordnet Dictionary). I'd really like to see a sentence that begins with: "A 'Moment' is ..." that is then as succinct as possible while remaining accurate, but using NO formulas. To me this would be a useful introduction, then the details would be covered in the rest of the article. - Madmax — Preceding unsigned comment added by 63.240.143.147 (talk) 17:02, 17 August 2011 (UTC)

Thanks all of you who work on this. The first few sentences seem to make sense (to me - calc II is the extent of my background), but then the figure loses me. Might you make the figure legend more explicit? Or a link to a more basic explanation elsewhere. radcen (talk) 18:00, 21 December 2011 (UTC)

I'm not surprised that the figure loses you. I only understand it because my specialty overlaps this, otherwise I'm pretty sure I wouldn't have a clue. I suggest the figure be removed. McKay (talk) 03:45, 22 December 2011 (UTC)

## Joint Moments

Just as for cumulants, where joint cumulants are used for collections of random variables, it would be helpful to add a new section to define joint moments in this page. Yongtwang (talk) 13:54, 13 May 2010 (UTC)

What about cross moments (see for example [1])? Are these the same as joint moments, or as mixed moments? --AMH-DS (talk) 14:52, 3 May 2011 (UTC)

## Higher Moments

It would be nice to have a reference justifying the assertion that "the higher the moment, the harder it is to estimate". —Preceding unsigned comment added by 194.117.40.134 (talk) 11:03, 2 September 2010 (UTC)

Yes, perhaps it would be nice, I agree. However, this is fairly clear, through the "degrees of freedom" concept in statistics. As further moments are added, degrees of freedom are "consumed" in the process. The explanation of this concept would be great, but I suspect it is likely already somewhere within Wikipedia. So, without re-inventing the wheel, maybe just a link to degrees of freedom. In fact, now that I'm saying this, let me look for any internal links.  :) This.is.mvw (talk) 17:18, 19 August 2013 (UTC)

## Alternative notation

It would be worth mentioning the alternative mathematical notation for the mean (and all moments), which puts an overline over the variable:

${\displaystyle {\bar {x^{n}}}=\int _{-\infty }^{\infty }x^{n}\,dF(x)\,}$.

This could be added as a note to the mean article. (If you know how, please edit this so the bar appears longer, extending over all of x^n.)

## Geometric Moments

i.e.

${\displaystyle \exp(\operatorname {E} \log |X|^{m})}$

worth adding? Domminico (talk) 19:15, 29 April 2012 (UTC)
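In case it helps evaluate the proposal, a quick numeric illustration (my own ad hoc sketch; the function name is made up) of exp(E log|X|^m) for a finite sample. For m = 1 this is just the geometric mean of |X|.

```python
import math

# "Geometric moment" exp(E[log |x|^m]) estimated from a finite sample.
def geometric_moment(xs, m):
    return math.exp(sum(m * math.log(abs(x)) for x in xs) / len(xs))

xs = [1.0, 2.0, 4.0]
print(geometric_moment(xs, 1))  # 2.0, the geometric mean of 1, 2, 4
```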

## When was the word 'moment' first used in physics and mathematics?

To understand the deeper meaning of the word moment it is necessary to find out when was the word 'moment' first used in physics and in what sense. Also, it is important to find out when it was first used in mathematics in the form such as in https://en.wikipedia.org/wiki/Moment_(physics). Maybe, historians of physics and mathematics would be able to better illustrate the matter.Bkpsusmitaa (talk) 16:20, 23 May 2014 (UTC)

The one in mathematics feels as if it was inspired by the one in physics. When Shakespeare wrote of "enterprises of great pith and moment", I think he was referring to something more akin to effects in the physical world than of something mathematical. However, it may also be the case that what the various quantities in physics called "moments" have in common is of a mathematical nature. Michael Hardy (talk) 16:43, 23 May 2014 (UTC)
Good question. According to my dictionary, the word comes from the Latin "momentum," meaning movement. And in fact, the movement patterns of a physical object can be approximated by its moments. Then according to http://statistics.about.com/od/Descriptive-Statistics/a/What-Are-Moments-In-Statistics.htm , the term indeed spread from physics to mathematics & statistics. I don't know "when" the current definitions were first introduced, but hope this helps a bit. —Patrug (talk) 17:41, 23 May 2014 (UTC)

## Incorrect information in the headline re: center of mass?

"In mathematics, a moment is a specific quantitative measure, used in both mechanics and statistics, of the shape of a set of points. If the points represent mass, then the zeroth moment is the total mass, the first moment is the center of mass, and the second moment is the rotational inertia."

1. If the points represent DENSITY (not mass), then the 0th moment is the total mass. Not if they represent mass. I'm assuming here that the "set of points" are the function f(x) described in the definition of the moment in section (1).
2. The first moment is NOT the center of mass. The first moment is the center of mass times the total mass.

It confused me when I read this page and I wanted to point it out. I will edit it (if I don't run out of computer battery) but if I am incorrect please revert. — Preceding unsigned comment added by Doublefelix921 (talkcontribs) 15:52, 8 September 2014 (UTC)

## Assessment comment

The comment(s) below were originally left at Talk:Moment (mathematics)/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

The emphasis on applications in statistics doesn't help. This article should be about the concept, and its many applications. Geometry guy 23:20, 22 June 2007 (UTC)

Suggest split into Moment (mathematics) and Moment (statistics). Melcombe (talk) 16:10, 1 April 2008 (UTC)

Last edited at 16:11, 1 April 2008 (UTC). Substituted at 20:07, 1 May 2016 (UTC)