
Talk:Expected value

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 24.91.51.31 (talk) at 20:22, 29 January 2012 (critical comment on overy technical wording). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

WikiProject Statistics (rated Start-class, Top-importance)
This article is within the scope of WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

WikiProject Mathematics (rated Start-class, Top-priority)
This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.

Archived discussion to mid 2009

What is it with you math people and your inability to express yourselves??

I have a PhD in molecular biology, and I do a much better job of explaining complex stuff. This article is - and I am being fact-based here - a disgrace. The opening paragraph should read something like: the expected value is the average or most probable value that we can expect; for instance, if we toss 2 fair dice, the expected value of the sum is 7; if we toss 100 fair pennies... You people need some fair but harsh criticism. I'm not enough of a math person, but someone should step up; this article, like most on math-based subjects, is not an ENCYCLOPEDIA article - it is for advanced math students. I am really mad at you people - you have had a long time to work on this and have done a bad job. I don't care if you have PhDs in math or are tenured profs at big-shot universities: you deserve all the opprobrium I'm heaping on you. Instead of dismissing me, why don't you ask a historian or an English major or a poet or a poli-sci person what they think?
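For what it's worth, the examples the commenter gestures at can be worked out directly. This is an illustrative sketch of mine, not anything from the article:

```python
# Illustrative check (mine, not from the article) of the commenter's examples.
from itertools import product

# Expected value of the sum of two fair dice: average over all 36
# equally likely outcomes.
two_dice = sum(a + b for a, b in product(range(1, 7), repeat=2)) / 36
print(two_dice)  # 7.0

# Expected number of heads in 100 fair penny tosses: n * p for a
# binomial random variable.
heads = 100 * 0.5
print(heads)  # 50.0
```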

Proposition to add origin of the theory

I propose to add the following in a separate chapter:

Blaise Pascal was challenged by a friend, Antoine Gombaud (self-acclaimed "Chevalier de Méré" and writer), with a gambling problem. The problem was that of two players who want to end a game early and, given the current circumstances of the game, want to divide the stakes fairly, based on the chance each has of winning the game from that point. How should they find this "fair amount"? In 1654, Pascal corresponded with Pierre de Fermat on the subject of gambling, and it is in the discussion of this problem that the foundations of the mathematical theory of probabilities were laid and the notion of expected value introduced.

--->PLEASE let me know (right here or on my talk page) if this would not be OK; otherwise I plan to add it in a few days<---

Phdb (talk) 14:00, 24 February 2009 (UTC)[reply]

Yes, please add. In fact, for quite some time the concept of expectation was more fundamental in the theory than the concept of probability. Fermat and Pascal never mentioned the word "probability" in their correspondence, for example. The probability concept then eventually emerged from the concept of expectation and replaced it as the fundamental concept of the theory. iNic (talk) 01:03, 26 February 2009 (UTC)[reply]
Laplace used the term "hope" or "mathematical hope" to denote the concept of expected value (see [1], ch. 6).
I wonder what name was used by Pascal (if any), and where did "expectation" come from?  … stpasha »  19:37, 24 September 2009 (UTC)[reply]

Strange

I'm not very familiar with statistics, and possibly this is why I can't see any sense in counting the numbers written on the sides of the die. What if the sides of the die were assigned signs with no defined alphanumerical order? What would the "expected value" be? --85.207.59.18 (talk) 12:14, 11 September 2009 (UTC)[reply]

In that case you can't really speak of a "value" of a certain side. If there's no value to a side, it's impossible to speak of an expectation value of throwing the die. The concept would be meaningless. Gabbe (talk) 15:52, 22 January 2010 (UTC)[reply]
Of course, you could assign values to the sides. The sides of a coin, for example, do not have numerical values. But if you assigned the value "−1" to heads and "1" to tails you would get the expected value
E = (−1)·(1/2) + 1·(1/2) = 0.
Similarly, if you instead gave heads the value "0" and tails the value "1" you would get
E = 0·(1/2) + 1·(1/2) = 1/2,
and so forth. Gabbe (talk) 20:34, 22 January 2010 (UTC)[reply]
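Gabbe's two coin examples amount to a probability-weighted average over whatever values we assign. A minimal sketch (mine, purely illustrative) of that computation:

```python
# Minimal sketch (illustrative) of Gabbe's coin examples: the expected
# value is the probability-weighted average of the assigned values.

def expected_value(values, probs):
    """Probability-weighted average of the assigned values."""
    return sum(v * p for v, p in zip(values, probs))

fair = [0.5, 0.5]  # a fair coin: heads and tails equally likely

print(expected_value([-1, 1], fair))  # heads = -1, tails = 1  ->  0.0
print(expected_value([0, 1], fair))   # heads = 0,  tails = 1  ->  0.5
```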

Upgrading the article

At present (Feb 2010) the article is rated as only "Start", yet supposedly has "Top" priority and is on the frequent-viewing lists. Some initial comments on where things fall short are:

  • The lead fails to say why "expected value" is important, either in general or as a foundation of statistical inference.
  • There is a lack of references, both at a general-reader level and for the more sophisticated stuff.
  • There is some duplication, which is not necessarily a problem, but it would be better if the duplicated material were cross-referenced.
  • There is a poor ordering of material, in terms of sophistication, with elementary level stuff interspersed.

I guess others will have other thoughts. Particularly for this topic, it should be a high priority to retain a good exposition that is accessible at an elementary level, for which there is a good start already in the article. Melcombe (talk) 13:17, 25 February 2010 (UTC)[reply]

Misleading

"The expected value is in general not a typical value that the random variable can take on. It is often helpful to interpret the expected value of a random variable as the long-run average value of the variable over many independent repetitions of an experiment."

So the expected value is the mean over repeated experiments (why not just say so?), and yet you explicitly tell me that it is "in general not a typical value that the random variable can take on". The normal distribution begs to differ. Regardless of theoretical justifications in multimodal cases, this is simply bizarre. More jargon != smarter theoreticians. Doug (talk) 18:33, 21 October 2010 (UTC)[reply]

What is the problem? The expected value is in general not a typical value. In the special case of the normal distribution it really is; who says otherwise? In the asymmetric unimodal case it is different from the mode. For a discrete distribution it is (in general) not a possible value at all. Boris Tsirelson (talk) 19:22, 21 October 2010 (UTC)[reply]
(why not just say so?) — because this is the statement of the law of large numbers — that when the expected value exists, the long-run average will converge almost surely to the expected value. If you define the expected value as the long-run average, then this theorem becomes circular. Also, for some random variables it is not possible to imagine repeating the experiment many times over (say, the indicator of whether a given person dies tomorrow). Expected value is a mathematical construct which exists regardless of the possibility of repeating the experiment.  // stpasha »  02:21, 22 October 2010 (UTC)[reply]
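To illustrate both points in this exchange (a sketch of my own, not from the thread): the long-run average of die rolls converges to the expected value 3.5, even though 3.5 is never a value a single roll can show:

```python
# Simulation (my own illustration): sample means of die rolls approach
# the expected value 3.5, which is itself never a possible outcome.
import random

random.seed(1)
n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]
sample_mean = sum(rolls) / n
print(round(sample_mean, 2))  # close to 3.5
```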

Proposition for alternative proof of E[X] = ∫₀^∞ (1 − F(x)) dx

I tried to add the proof below which I believe to be correct (except for a minor typo which is now changed). This was undone because "it does not work for certain heavy-tailed distribution such as Pareto (α < 1)". Can someone elaborate?

Alternative proof: Using integration by parts,

E[X] = ∫₀^∞ x f(x) dx = [−x(1 − F(x))]₀^∞ + ∫₀^∞ (1 − F(x)) dx = ∫₀^∞ (1 − F(x)) dx,

and the bracket vanishes because x(1 − F(x)) → 0 as x → ∞. —Preceding unsigned comment added by 160.39.51.111 (talk) 02:50, 13 May 2011 (UTC)[reply]

Actually the end of section 1.4 seems in agreement, so I am reinstating my changes. —Preceding unsigned comment added by 160.39.51.111 (talk) 02:54, 13 May 2011 (UTC)[reply]
I have removed it again. The "proof" is invalid as it explicitly relies on the assumption x(1 − F(x)) → 0 as x → ∞, which does not hold for all cdfs (e.g. Pareto as said above). You might try reversing the argument and doing an integration by parts, starting with the "result", which might then be shown to be equivalent to the formula involving the density. PS, please sign your posts on talk pages. JA(000)Davidson (talk) 09:40, 13 May 2011 (UTC)[reply]
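A quick numerical look (my own, purely illustrative) at JA(000)Davidson's objection: for a Pareto law with α < 1 and support x ≥ 1, the survival function is 1 − F(x) = x^(−α), so the bracket term x(1 − F(x)) = x^(1−α) grows without bound instead of vanishing:

```python
# Pareto(alpha) with support x >= 1 has cdf F(x) = 1 - x**(-alpha).
# For alpha < 1 the bracket term x * (1 - F(x)) = x**(1 - alpha) diverges.
alpha = 0.5
xs = [10, 1_000, 100_000]
brackets = [x * x ** (-alpha) for x in xs]
print(brackets)  # grows without bound: ~3.16, ~31.6, ~316
```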
Let's try to sort this out: I claim that whenever a nonnegative X has an expectation, then E[X] = ∫₀^∞ (1 − F(x)) dx. (The Pareto distribution with α < 1 doesn't even have an expectation, so it is not a valid counter-example.)
Proof: Assuming X has density function f, we have for any M > 0
∫₀^M x f(x) dx = [−x(1 − F(x))]₀^M + ∫₀^M (1 − F(x)) dx = −M(1 − F(M)) + ∫₀^M (1 − F(x)) dx.
Recognizing that 0 ≤ M(1 − F(M)) ≤ ∫_M^∞ x f(x) dx → 0 as M → ∞ (since E[X] is finite) and rearranging terms:
E[X] = ∫₀^∞ (1 − F(x)) dx,
as claimed.
Are we all in agreement, or am I missing something again? Phaedo1732 (talk) 19:05, 13 May 2011 (UTC)[reply]
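As a numerical sanity check of the claimed identity (my own sketch, using an Exponential(2) distribution, for which E[X] = 1/2 and 1 − F(x) = e^(−2x)):

```python
# Riemann-sum check of E[X] = integral of (1 - F(x)) over [0, inf)
# for X ~ Exponential(rate=2): 1 - F(x) = exp(-2 x), true mean 0.5.
import math

rate = 2.0
dx = 1e-4
# Truncate at x = 20; the exponential tail beyond that is negligible.
integral = sum(math.exp(-rate * i * dx) * dx for i in range(int(20 / dx)))
print(round(integral, 3))  # approximately 0.5
```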
Regardless of the validity of the proof, is an alternative proof a strong addition to the page? CRETOG8(t/c) 19:12, 13 May 2011 (UTC)[reply]
I think so, because the current proof is more like a trick than a generic method, whereas the alternative proof could be generalized (as shown in Section 1.4). I also think the point of an encyclopedia is to give more information rather than less. Phaedo1732 (talk) 00:49, 14 May 2011 (UTC)[reply]
See point 6 of WP:NOTTEXTBOOK, and WP:MSM#Proofs. This doesn't seem to be a place that needs a proof at all. What is needed is a proper citation for the result, and a proper statement of the result and its generalisation to other lower bounds. (I.e. the result could be used as an alternative definition of "expected value", but are the definitions entirely equivalent?) JA(000)Davidson (talk) 08:28, 16 May 2011 (UTC)[reply]
Clearly the previous editor of that section thought a proof should be given. If anyone comes up with a good citation, I am all for it. Phaedo1732 (talk) 15:31, 16 May 2011 (UTC)[reply]

Simple generalization of the cumulative function integral

Currently the article has the integral

E[X] = ∫₀^∞ (1 − F(x)) dx

for non-negative random variables X. However, the non-negativeness restriction is easily removed, resulting in

E[X] = −∫_{−∞}^0 F(x) dx + ∫₀^∞ (1 − F(x)) dx.
Should we give the more general form, too? -- Coffee2theorems (talk) 22:33, 25 November 2011 (UTC)[reply]

But do not forget the minus sign before the first integral. Boris Tsirelson (talk) 15:47, 26 November 2011 (UTC)[reply]
Oops. Fixed. Anyhow, do you think it would be a useful addition? -- Coffee2theorems (talk) 19:29, 4 December 2011 (UTC)[reply]
Yes, why not. I always present it in my courses.
And by the way, did you see in "general definition" these formulas:
  • E[X] = ∫₀^∞ [1 − F(x)] dx if Pr[X ≥ 0] = 1,
  • E[|X|] = ∫₀^∞ [1 − F(x)] dx if Pr[X ≥ 0] = 1.
I doubt it is true under just this condition. Boris Tsirelson (talk) 07:26, 5 December 2011 (UTC)[reply]
Moreover, the last formula is ridiculous:
E[|X|] = ∫₀^∞ [1 − F(x)] dx if Pr[X ≥ 0] = 1, where F is the cumulative distribution function of X.
Who needs the absolute value of X assuming that X is non-negative? Boris Tsirelson (talk) 07:30, 5 December 2011 (UTC)[reply]
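For what it's worth, the two-sided formula with the minus sign in place can be checked numerically. This is my own sketch using X ~ Normal(1, 1), for which E[X] = 1:

```python
# Check of E[X] = -integral of F over (-inf, 0] + integral of (1 - F)
# over [0, inf), for X ~ Normal(mean=1, sd=1), where E[X] = 1.
import math

def F(x, mu=1.0, sigma=1.0):
    """Normal cumulative distribution function via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

dx = 1e-3
steps = int(10 / dx)  # truncate both integrals at |x| = 10
neg = sum(F(-i * dx) * dx for i in range(1, steps))
pos = sum((1.0 - F(i * dx)) * dx for i in range(steps))
print(round(pos - neg, 2))  # approximately 1.0
```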

"Expected value of a function" seems to be misplaced

Does the text starting with "The expected value of an arbitrary function of ..." really belong to the definition of the expectation, or would it be better to move it to Properties, between 3.6 and 3.7, and give it a new section (with which title?)? I am not entirely sure, but I think one can derive the expected value of a function of a random variable without the need for an explicit definition. After all, the function of a random variable is a random variable again; given that random variables are (measurable) functions themselves, it should be possible to construct $E(g(X))$ just from the general definition of $E$. Any thoughts? Grumpfel (talk) 21:54, 29 November 2011 (UTC)[reply]
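Grumpfel's point can be illustrated concretely (my own sketch, with a made-up discrete distribution): computing E[g(X)] from the induced distribution of the new random variable Y = g(X) agrees with the usual shortcut Σ g(x) Pr[X = x]:

```python
# Discrete illustration: E[g(X)] via the induced distribution of
# Y = g(X) equals the shortcut sum of g(x) * Pr[X = x].
from collections import defaultdict

p_X = {1: 0.2, 2: 0.5, 3: 0.3}  # a made-up distribution of X

def g(x):
    return x * x                # an arbitrary function of X

# Shortcut ("law of the unconscious statistician"):
lhs = sum(g(x) * p for x, p in p_X.items())

# From the general definition: build the distribution of Y = g(X),
# then take the plain expected value of Y.
p_Y = defaultdict(float)
for x, p in p_X.items():
    p_Y[g(x)] += p
rhs = sum(y * p for y, p in p_Y.items())

print(abs(lhs - rhs) < 1e-12)  # True: both are E[g(X)]
```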