Talk:Sum of normally distributed random variables

WikiProject Statistics (Rated Start-class, Low-importance)

This article is within the scope of the WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.


Hi! Recently, articles on Wikipedia have become so good that they can be used as the basis for a literature review. However, very often (as on this page) "real" literature references are missing. I assume this proof was done by someone other than the author. I would like to see references. Kind regards, Steven

In this case, the problem is which of the many references to use? One could just say "See any standard textbook on the subject", and it would be essentially correct, but probably doesn't count as a "reference". Michael Hardy 18:39, 3 August 2006 (UTC)

Product?[edit]

What about the product of normally distributed random variables? I found a document that discusses it which says that if V=XY then

f_V(v) = \int_{-\infty}^\infty f_{X,Y}(x,v/x)\frac{1}{|x|}\,dx.

But I'm having trouble finding the mean and variance of this distribution. (I suppose it might not even be normally distributed.) —Ben FrantzDale 04:57, 31 January 2007 (UTC)

This may have the answer: http://mathworld.wolfram.com/NormalProductDistribution.html —Ben FrantzDale 05:27, 31 January 2007 (UTC)
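For what it's worth, the first two moments of V = XY for independent normals follow directly from E[XY] = E[X]E[Y] and E[X²Y²] = E[X²]E[Y²], even though V itself is not normal. A quick simulation sketch (Python with NumPy; the parameter values are arbitrary, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_x, sig_x = 1.0, 2.0
mu_y, sig_y = -0.5, 1.5
n = 1_000_000

x = rng.normal(mu_x, sig_x, n)
y = rng.normal(mu_y, sig_y, n)  # independent of x
v = x * y

# For independent X, Y:
#   E[XY]   = mu_x * mu_y
#   Var(XY) = sig_x^2*sig_y^2 + sig_x^2*mu_y^2 + sig_y^2*mu_x^2
mean_theory = mu_x * mu_y
var_theory = sig_x**2 * sig_y**2 + sig_x**2 * mu_y**2 + sig_y**2 * mu_x**2

print(v.mean(), mean_theory)  # close to -0.5
print(v.var(), var_theory)    # close to 9 + 1 + 2.25 = 12.25
```

A histogram of `v` also shows the sharp peak at zero characteristic of the product distribution discussed on the MathWorld page.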

Case if the variables are correlated[edit]

For the case where the variables are correlated, I have given an outline of how to proceed with the derivation. Velocidex (talk) 02:03, 1 July 2008 (UTC)


You should also provide the covariance matrix, because the correlation coefficients are not clear. How do you get the term 2ρσ_xσ_y? You should get 2ρ, unless the cross-correlation is ρσ_xσ_y. Energon (talk) 13:27, 16 June 2009 (UTC)

Also, the article says 'whenever ρ < 1, then the standard deviation is less than the sum of the standard deviations of X and Y' - but the formula implies it should be greater (unless ρ < -1). Something is wrong here. Ben Finn (talk) 20:09, 19 August 2011 (UTC)
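As a side note, the variance formula for the correlated case, Var(X + Y) = σ_x² + σ_y² + 2ρσ_xσ_y, is easy to probe numerically (Python sketch, illustrative only):

```python
import math

def sd_of_sum(sig_x, sig_y, rho):
    # Var(X + Y) = sig_x^2 + sig_y^2 + 2*rho*sig_x*sig_y
    return math.sqrt(sig_x**2 + sig_y**2 + 2 * rho * sig_x * sig_y)

sig_x, sig_y = 1.0, 1.0
for rho in (-0.5, 0.0, 0.5, 1.0):
    print(rho, sd_of_sum(sig_x, sig_y, rho), sig_x + sig_y)
```

Since (σ_x + σ_y)² = σ_x² + σ_y² + 2σ_xσ_y, the printed standard deviation only reaches σ_x + σ_y at ρ = 1 and is smaller for every ρ < 1.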

Integrating the Dirac delta function[edit]

In the section Proof using convolutions, we might want to include a note that the Dirac delta function satisfies the identity

\int_{-\infty}^\infty \delta(x) \, dx = 1,

and can thus be dropped. 192.91.171.42 (talk) 21:00, 14 April 2009 (UTC)
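As a sanity check on the convolution proof being discussed, the convolution of two normal densities can be compared numerically to the claimed result, the density of N(μ₁ + μ₂, σ₁² + σ₂²). A sketch (Python with NumPy; grid spacing and parameters are arbitrary):

```python
import numpy as np

def normal_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

dx = 0.01
x = np.arange(-20.0, 20.0, dx)
mu1, sig1 = 1.0, 1.0
mu2, sig2 = -2.0, 2.0

f = normal_pdf(x, mu1, sig1)
g = normal_pdf(x, mu2, sig2)

# Discrete approximation of (f * g)(z), evaluated on the grid of sums x_i + x_j.
conv = np.convolve(f, g) * dx
z = 2 * x[0] + dx * np.arange(conv.size)

# Claimed result: density of N(mu1 + mu2, sig1^2 + sig2^2).
target = normal_pdf(z, mu1 + mu2, np.sqrt(sig1**2 + sig2**2))
print(np.max(np.abs(conv - target)))  # small (discretization error only)
```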

Geometric proof[edit]

Pretty sure there's an error here. c = \sqrt{ (Z/2)^2 + (Z/2)^2 } = Z/2\, should read c = \sqrt{ (Z/2)^2 + (Z/2)^2 } = Z/\sqrt{2}\, and  \text{erf}(Z/2) should read  \text{erf}(Z/\sqrt{2}). —Preceding unsigned comment added by 128.237.245.76 (talk) 13:15, 4 October 2010 (UTC)
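For reference, the algebra behind the proposed correction is just:

```latex
c = \sqrt{(Z/2)^2 + (Z/2)^2} = \sqrt{2}\,\frac{Z}{2} = \frac{Z}{\sqrt{2}},
```

which is why the argument of the error function becomes Z/\sqrt{2} rather than Z/2.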

Variance of mean[edit]

This seems like a good page to discuss the variance of the mean. In particular, the variance of the sum is the sum of the variances, so the variance of the mean is the sum of the variances divided by n^2. Equivalently, the standard deviation of the mean is the standard deviation of the sum divided by n. —Ben FrantzDale (talk) 17:56, 13 June 2011 (UTC)

I don't think it belongs here -- I'll put it into Sample mean and sample covariance. I'm surprised it's not already there. Duoduoduo (talk) 18:58, 13 June 2011 (UTC)
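The relationships described above can be sketched numerically (Python with NumPy; the sample size and repetition count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100         # sample size
reps = 50_000   # number of repeated samples
sigma = 3.0

# Each row is one sample of n i.i.d. N(0, sigma^2) draws.
samples = rng.normal(0.0, sigma, size=(reps, n))
means = samples.mean(axis=1)

# Var(mean) = Var(sum) / n^2 = (n * sigma^2) / n^2 = sigma^2 / n
print(means.var())  # close to sigma**2 / n = 0.09
print(means.std())  # close to sigma / sqrt(n) = 0.3
```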

Correlated random variables[edit]

The text says that when rho < 1, the standard deviation is less than the sum of the standard deviations of X and Y. This should instead read rho < 0. — Preceding unsigned comment added by 210.11.58.37 (talk) 05:47, 14 November 2011 (UTC)