Talk:Normally distributed and uncorrelated does not imply independent

WikiProject Statistics (Rated Start-class, Low-importance)

This article is within the scope of the WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.


Is this example really simpler than my second example? My second example is

X is Gaussian
Z, independent of X, is −1 or 1, each with probability 0.5
Y = XZ

Again, X and Y are both normal and uncorrelated, but they are not jointly normal and not independent. — ciphergoth 07:12, 2005 Apr 29 (UTC)
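This construction is easy to sanity-check numerically. A minimal sketch of my own (the seed and sample size are arbitrary choices, not from the discussion):

```python
# Monte Carlo check of ciphergoth's example:
# X ~ N(0,1), Z = +/-1 each with probability 0.5, Y = XZ.
import random
import statistics

random.seed(0)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
zs = [random.choice((-1.0, 1.0)) for _ in range(n)]
ys = [x * z for x, z in zip(xs, zs)]

# Y is marginally N(0,1): sample mean ~ 0, sample stdev ~ 1.
print(statistics.mean(ys), statistics.stdev(ys))

# X and Y are uncorrelated: E[XY] = E[Z X^2] = E[Z] E[X^2] = 0.
cov_xy = statistics.mean(x * y for x, y in zip(xs, ys))
print(cov_xy)

# But they are not independent: |Y| = |X| always, so e.g.
# P(|Y| > 1 given |X| > 1) = 1, while P(|Y| > 1) is about 0.32.
```

The dependence is exact (|Y| always equals |X|), so no simulation is needed for that part; the sampling only illustrates the marginal normality and zero covariance.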


I think the paragraph outlining the interpretation of uncorrelatedness for a set of normally distributed random variables is a little confusing. What about something similar to the section on Correlations and independence in the multivariate normal distribution article?

In general, random variables may be uncorrelated but highly dependent. But if a random vector has a multivariate normal distribution then any two or more of its components that are uncorrelated are independent. This implies that any two or more of its components that are pairwise independent are independent. For example, suppose two random variables X and Y are jointly normally distributed; that is, the random vector (X, Y) has a multivariate normal distribution. This means that the joint probability distribution of X and Y is such that for any two constant (i.e., non-random) scalars a and b, the random variable aX + bY is normally distributed. In this case, if X and Y are uncorrelated, i.e., their covariance cov(X, Y) is zero, then they are independent.
However, it is not true that two random variables that are (separately, marginally) normally distributed and uncorrelated are independent. Two random variables that are normally distributed may fail to be jointly normally distributed, i.e., the vector whose components they are may fail to have a multivariate normal distribution. (talk) 17:57, 24 July 2008 (UTC)
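The jointly normal case in that passage can also be illustrated numerically. In this sketch (my own construction, assuming unit-variance components), X and Y are jointly normal with zero correlation, and conditioning on X visibly leaves the distribution of Y unchanged:

```python
# Jointly normal, uncorrelated case: independence follows.
# (X, Y) built from independent N(0,1) draws is jointly normal;
# with zero correlation, the components are independent.
import random
import statistics

random.seed(0)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]

cov = statistics.mean(x * y for x, y in zip(xs, ys))

# Independence shows up in conditional probabilities:
# P(Y < 0) and P(Y < 0 given X > 1) should both be ~ 0.5.
p_y_neg = sum(y < 0 for y in ys) / n
sel = [y for x, y in zip(xs, ys) if x > 1.0]
p_y_neg_given = sum(y < 0 for y in sel) / len(sel)
print(cov, p_y_neg, p_y_neg_given)
```

This is the contrast with the article's examples, where conditioning on X changes the conditional behavior of Y even though the covariance is zero.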

The example?

I guess I don't understand this:

How can Y be normally distributed? It's definitely a nonlinear transform, except that Y has no support on (−∞, −c) and (0, c). What am I missing? Cburnett 14:34, Apr 29, 2005 (UTC)

P(Y < a) = P(X < a & |X| > c) + P(X > −a & |X| < c) = P(X < a & |X| > c) + P(X < a & |X| < c) (by the symmetry of X) = P(X < a)
On the other hand, if it's not clear even to you that Y is normally distributed in this example, that suggests that my example is better :-) — ciphergoth 16:17, 2005 Apr 29 (UTC)
Y is indeed normally distributed. Its support does include (−∞, −c), since it is in that interval whenever X is in that interval (and it is then equal to X). Also, it is in the interval (0, c) whenever X is in the interval (−c, 0). That it is normally distributed is a pretty simple exercise. Michael Hardy 20:25, 29 Apr 2005 (UTC)

Oh. Oops. I had somehow completely missed that it was comparing abs(X) instead of X (and so thought that it was folding on one side only).
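For anyone else who trips over the same point, the symmetry argument can be checked numerically. A sketch of my own; any positive c works for the normality claim, and c = 1.5 here is an arbitrary choice:

```python
# Check that Y = X if |X| > c, Y = -X if |X| <= c is marginally N(0,1),
# by comparing the empirical CDF of Y against the standard normal CDF.
import math
import random

random.seed(0)
c = 1.5
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x if abs(x) > c else -x for x in xs]

def Phi(a):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

pts = (-2.0, -1.0, 0.0, 1.0, 2.0)
emps = {a: sum(y < a for y in ys) / n for a in pts}
for a in pts:
    print(a, emps[a], Phi(a))
```

The empirical CDF of Y matches Φ at every test point, which is exactly the identity P(Y < a) = P(X < a) derived above.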


At the moment, this article has no references or sources. It has the appearance of original research, which is bad for an article which basically says "forget your preconceptions about the subject matter, they're wrong!" (Note that I'm not contesting the subject matter, which I completely agree with, merely the lack of references.) Oli Filth 08:24, 26 June 2007 (UTC)

I'm less inclined to sympathize with claims of "original research" in cases like this, where the correctness of the results can easily be checked in a minute. However, these results are certainly out there in the literature. I'll see if I can find references. Michael Hardy 17:54, 26 June 2007 (UTC)

Simpler examples

X ~ N(0,1); Cov(X,X^2) = 0, yet X and X^2 are clearly not independent —Preceding unsigned comment added by (talk) 17:01, 20 February 2008 (UTC)

However, X^2 isn't normally distributed. Oli Filth(talk) 20:04, 20 February 2008 (UTC)
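Still, the uncorrelatedness half of that proposed example is easy to verify numerically: Cov(X, X²) = E[X³] − E[X]·E[X²] = 0 for X ~ N(0,1). A minimal sketch of my own, with an arbitrary sample size:

```python
# Cov(X, X^2) = 0 for X ~ N(0,1), even though X^2 is a
# deterministic function of X (so clearly not independent of it).
import random
import statistics

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
sq = [x * x for x in xs]
cov = (statistics.mean(x * s for x, s in zip(xs, sq))
       - statistics.mean(xs) * statistics.mean(sq))
print(cov)  # ~ 0 up to sampling noise
```

As Oli Filth notes, this does not fit the article's title, since X² is chi-squared distributed rather than normal.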


Maybe this article should be merged with normal distribution, multivariate normal, or independence or something similar. Just throwing it out there.--Fangz (talk) 14:57, 16 May 2008 (UTC)

Well, those pages ought to link to this one, and I see that they don't, so I'll take care of that. Michael Hardy (talk) 16:39, 16 May 2008 (UTC)
Wait: they do link to it. I was looking at links to this discussion page. Michael Hardy (talk) 16:40, 16 May 2008 (UTC)
Merge. This is just a sub-topic of Statistical independence. In addition, the explanations and examples are (probably; no refs) WP:OR. (And the title reminds me of a Why buy the cow when you can get the milk for free article referenced from Chastity  :-) Saintrain (talk) 21:05, 22 July 2008 (UTC)

They're obviously well-known examples. I'll see if I can dig up references. It could be called a "subtopic of normal distribution" as well, so if it needs to get merged, it's not clear that that's what it should be merged into. Michael Hardy (talk) 22:39, 22 July 2008 (UTC)

...OK, I've added a reference for one of the examples. Michael Hardy (talk) 23:00, 22 July 2008 (UTC)

Section headings

A mathematical error occurred in some recent edits as a result of someone's failure to notice that the two examples in the bulleted list were in fact TWO SEPARATE examples. Therefore I've created subsection headings to avoid such confusion in the future. Michael Hardy (talk) 19:03, 22 August 2008 (UTC)

Misleading title

I would suggest that the name of the page be edited to say "Marginally normally distributed..." instead of "Normally distributed...". The reason is that the title is ambiguous, since it could refer to a joint normal distribution, which does imply independence. I don't know how to change this myself, so I will leave it to someone else. SCF71

I agree that the title is misleading--I think that most people, when they read the phrase "normally distributed" in a multivariate context, especially in the space-constrained context of an article title, will assume that this is intended as space-saving shorthand for "jointly normally distributed." And the problem is that the intent is not clarified until sentences 8 and 9 of the article.
However, adding "Marginally" to the beginning of the title would (a) make the already long title too long, and (b) cause the article not to pop up below the search box when one types in "Normally." So I propose clarifying this by extending sentence 4 from "However, this is incorrect." to "However, this is incorrect if the variables are merely marginally normally distributed but not jointly normally distributed." Duoduoduo (talk) 15:12, 4 June 2010 (UTC)
Seeing no objection, I've done this clarification. Duoduoduo (talk) 14:23, 9 June 2010 (UTC)


This needs a graph to show what's going on. —Ben FrantzDale (talk) 16:01, 29 July 2009 (UTC)

Perhaps a picture of a person's nametag as "Mohammed Smith"? ZtObOr 16:37, 20 July 2012 (UTC)
I've removed the diagram request template, as the page now has two diagrams. —Granger (talk · contribs) 20:38, 17 October 2014 (UTC)

Incorrect diagram

I don't think the diagram matches the equations in the text (just need to multiply Y by -1 and you have it). Could this be changed? — Preceding unsigned comment added by Bakerccm (talkcontribs) 17:23, 21 September 2012 (UTC)

With reference to the diagram in the asymmetric example section, I agree: the section gives the definitions of X and Y.
Clearly in the graph Y=X for small values of X, not large values. Since I don't know how to correct the graph, I'll change the example and the proof that goes with it. Duoduoduo (talk) 18:29, 12 May 2013 (UTC)

Step from `E(XY)` to `E(E(XY|W))` is insufficiently explained

At least in terms of the notation, I have no idea what the two different E's are supposed to mean. On my first reading, E should be idempotent: if it has no way of determining what it is supposed to be summing over and averaging, then no matter how many times you apply E, the result is left unchanged after the first application.

I think what you might have meant was something like:

E(XY) = E_W( E(XY | W) )

And even that I'm not happy with.

The simple idea is that if we are to treat the two expectations differently, then we need some way to designate them. And if this is in some way supposed to reduce to what I thought it would have to be, namely marginalizing out the auxiliary variable W (as if it were observed), that gives:

E(XY) = Σ_w P(W = w) E(XY | W = w)

I realize that being this explicit with notation can be tedious for some, but it is absolutely necessary to convey exactly what is meant, especially when variable-free operators are assumed to be idempotent (or at least need not rely on a lack of idempotency for their proper interpretation).

The point is that the first expectation is an average, taken with respect to W, of terms that are each defined conditional on a particular value w (which determines the valence, i.e. the sign, of the relationship between X and Y) and then summed over.

I find all of this reasoning to be relevant to the discussion, but I do not know how much of this should be included in the actual article. I figure the equation at the least would make for an improvement, but I wasn't sure if this argument accords with the central tenets of wikipedia as interpreted around these parts.

If the goal isn't to convey knowledge efficiently, accurately, and as completely as possible, in a way that is publicly verifiable, then I don't know what it is. Keeping the current equation helps no one and hurts some people, because even experts who could figure it out would then not need to do so in order to immediately comprehend what it means.

Mdpacer (talk) 07:48, 24 June 2015 (UTC)
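For what it's worth, the tower-property step can be checked numerically. This sketch assumes the construction from the first thread above (Y = WX with W = ±1 each with probability 1/2, which I take to be the W the proof refers to):

```python
# Tower property: E[XY] = sum over w of P(W = w) * E[XY | W = w],
# under the assumed construction Y = W * X with W = +/-1.
import random
import statistics

random.seed(0)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ws = [random.choice((-1.0, 1.0)) for _ in range(n)]
ys = [w * x for w, x in zip(ws, xs)]

# Conditional expectations: given W = 1, XY = X^2 (mean ~ +1);
# given W = -1, XY = -X^2 (mean ~ -1).
conds = {}
for w in (-1.0, 1.0):
    conds[w] = statistics.mean(
        x * y for x, y, wv in zip(xs, ys, ws) if wv == w)
    print(w, conds[w])

# Averaging over W: 0.5 * (+1) + 0.5 * (-1) = 0.
total = statistics.mean(x * y for x, y in zip(xs, ys))
print(total)  # ~ 0
```

The outer E is the average over w of the conditional means, which is exactly the marginalization written out above.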