|WikiProject Mathematics||(Rated Start-class, Mid-importance)|
|WikiProject Statistics||(Rated Start-class, Mid-importance)|
- 1 Diagram
- 2 Simple explanation
- 3 Incorrect Interpretation?
- 4 Unfavourable Example values?
- 5 Real-World Example
- 6 joint distribution and conditional distribution equal?
- 7 In the example, shouldn't it say that the _conditional_ probability distribution is being weighted by P(L), instead of the _joint_ probability distribution?
I find the diagram a little hard to relate to the text at the moment. I think it would be very helpful indeed if someone could add a small description of what the different coloured regions represent, for example. Jimjamjak (talk) 17:06, 16 December 2011 (UTC)
I guess that most readers are (like me) beginners at probability and statistics. It would be very good if someone could add an easy pedagogical explanation as a supplement to the formula. Thank you! Aaker (talk) 18:48, 14 December 2008 (UTC)
- Yes, the explanation should be simpler. Their explanation basically took some very simple, rudimentary statistics and put it in terms that would be way too complex for anyone who didn't already know what a marginal distribution was.--Kodiak42 (talk) 01:06, 29 August 2012 (UTC)
How about this: the marginal distribution of x, g(x), is equal to the integral of f(x,y) (the joint probability density function) with respect to the y variable: g(x) = ∫ f(x,y) dy. I'm currently in Probability and Stats for Engineers and that is the way it was explained to me. --220.127.116.11 (talk) 17:48, 1 April 2014 (UTC)
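The integral above can be checked numerically. The following is a minimal sketch (the density f(x,y) = x + y on the unit square and the function names are illustrative choices, not from the article or the discussion): integrating out y with a midpoint Riemann sum should recover the analytic marginal g(x) = x + 1/2.

```python
def f(x, y):
    """Illustrative joint density f(x, y) = x + y on [0, 1] x [0, 1]
    (it integrates to 1 over the unit square)."""
    return x + y

def marginal_x(x, n=100_000):
    """Approximate g(x) = integral of f(x, y) dy over [0, 1]
    with a midpoint Riemann sum."""
    dy = 1.0 / n
    return sum(f(x, (k + 0.5) * dy) for k in range(n)) * dy

# Analytically, g(x) = x + 1/2, so g(0.3) = 0.8.
print(marginal_x(0.3))  # ≈ 0.8
```

The midpoint rule is exact for densities that are linear in y, so the approximation here is limited only by floating-point rounding.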
Surely it should be interpreted as a 28.5% chance of getting hit if you *ignore* the traffic light, rather than if you walk across the road blindfolded?!
- Yes, otherwise the new conditions alter the original probability distribution. I've updated the example. —Preceding unsigned comment added by 18.104.22.168 (talk) 20:09, 16 May 2010 (UTC)
Unfavourable Example values?
Thanks for the "Real World Example". However, the values for the conditional probabilities sum up to one:
P(H=hit|L=red) = 0.01, P(H=hit|L=yellow) = 0.09, P(H=hit|L=green) = 0.9
Although this might be the case here, in general I think it won't be (e.g. assume very alert drivers, so that one is hit with probability of only 0.5 even though the lights are green; or a Saturday-night rush hour, when drunk drivers hit any pedestrian crossing). So, for the sake of clarity, shouldn't the values in the example be modified? Please correct me if I am wrong. Melwin Quacke (talk) 12:20, 24 November 2009 (UTC)
- You're correct, only the columns of the conditional probability table should add up to 1. Updated. —Preceding unsigned comment added by 22.214.171.124 (talk) 21:03, 16 May 2010 (UTC)
Thumbs up for the "Real-World Example". I wish every theoretical WP article had such clear examples. Beginners can very easily get lost in jargon. The example really helps readability. —Preceding unsigned comment added by 126.96.36.199 (talk) 00:40, 11 January 2010 (UTC)
The discussion on the conditional probability of being hit depending on the state of the lights says: "You are, for example, far more likely to be hit by a car if you try to cross while the lights for cross traffic are green than if they are red." Yet the second row of the table of probabilities gives:

        G     Y     R
H=Hit  0.01  0.1   0.8

That's inconsistent, since in the data the probability of getting hit when the lights are green is 0.01, which is a lot less than 0.8 (lights are red). Either the data has to be relabeled, or the text has to swap red and green, or the idea of lights for the cross traffic (that is, lights for the orthogonal traffic, which are opposite to you) has to be made clearer. I'd opt for swapping them and dropping the phrase "cross traffic", saying instead "the lights for the traffic on the road you are trying to cross", or something like that. Skthetwo (talk) 20:26, 27 January 2012 (UTC)
I had the same issue with the inconsistency of the semantics vs. the numerical values for the different light colors. I changed the page to make them match, but in the opposite way from what Skthetwo described: I kept the interpretation "green light = green for cross traffic = pedestrian should NOT cross" and changed the values to reflect that. It still seems like a confusing model, since it describes the pedestrian in terms of the cross traffic; on the other hand, no pedestrian lights are Red/Yellow/Green, so I didn't want to change it to that. Thus I kept this talk section up in case someone has a more elegant idea. Lily.r.s (talk) 01:19, 2 June 2014 (UTC)
The real-world example is a big help in understanding what marginal distributions are. However, I am still a bit lost as to how to interpret the numbers at the end of the example. In the current example, 0.572 is the marginal probability on the "hit" row. So, can we say that based on this example, the probability of a pedestrian being hit if they don't pay attention to the condition of the traffic light is 0.572? If a knowledgeable person reads this and agrees, can you add an accurate interpretation sentence to the example portion of the article? — Preceding unsigned comment added by 188.8.131.52 (talk) 18:34, 3 February 2015 (UTC)
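The 0.572 figure can be reconstructed as a marginal probability, P(H=hit) = Σ_L P(H=hit | L) · P(L). The sketch below is hypothetical: the light probabilities P(L) (red = 0.2, yellow = 0.1, green = 0.7) are illustrative values that happen to yield 0.572 together with the conditional values quoted earlier in this thread, and may differ from the article's actual table.

```python
# Assumed distribution of the light state, P(L) — not from the article.
p_light = {"red": 0.2, "yellow": 0.1, "green": 0.7}

# Conditional probabilities of being hit, P(H=hit | L), as quoted in
# the thread above ("green" here means green for cross traffic).
p_hit_given_light = {"red": 0.01, "yellow": 0.1, "green": 0.8}

# Marginal probability of being hit, ignoring the light:
# P(H=hit) = sum over L of P(H=hit | L) * P(L)
p_hit = sum(p_hit_given_light[l] * p_light[l] for l in p_light)
print(round(p_hit, 3))  # 0.572
```

Under these assumptions the interpretation asked about above is correct: 0.572 is the probability of being hit when the light state is not observed (averaged over how often each light occurs).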
joint distribution and conditional distribution equal?
There is something I don't understand in this explanation: why are the joint distribution and conditional distribution equal here? Nova77 05:14, 11 Feb 2005 (UTC)
- They're not. I've looked at old versions and I can't find what makes you think they are. Michael Hardy 19:57, 7 May 2005 (UTC)
In the example, shouldn't it say that the _conditional_ probability distribution is being weighted by P(L), instead of the _joint_ probability distribution?
Very nicely explained article, especially as it includes an actual worked example. Regarding this example, I have a comment. In the example, it says: "So in this case the answer for the marginal probability can be found by summing P(H,L) for all possible values of L, with each value of L weighted by its probability of occurring."
But shouldn't it be "P(H | L)", i.e. the conditional probability distribution? At least in the formula for P(H), only the conditional probability distribution is weighted by the probability of L. The joint probability distribution is simply summed, without any weights.
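The commenter's point can be illustrated numerically: summing the joint P(H, L) needs no extra weights, while the conditional P(H | L) must be weighted by P(L), and both routes give the same marginal. The numbers below are hypothetical stand-ins, not the article's table.

```python
# Assumed P(L) and P(H=hit | L) — illustrative values only.
p_light = {"red": 0.2, "yellow": 0.1, "green": 0.7}
p_hit_given = {"red": 0.01, "yellow": 0.1, "green": 0.8}

# Joint distribution over the "hit" row: P(H=hit, L) = P(H=hit | L) * P(L)
p_joint = {l: p_hit_given[l] * p_light[l] for l in p_light}

# Route 1: marginal from the joint — a plain, unweighted sum.
marginal_from_joint = sum(p_joint.values())

# Route 2: marginal from the conditional — each term weighted by P(L).
marginal_from_cond = sum(p_hit_given[l] * p_light[l] for l in p_light)

print(marginal_from_joint, marginal_from_cond)  # both ≈ 0.572
```

So the quoted sentence is consistent if "weighted" is read as applying to the conditional distribution: the weights P(L) are already folded into the joint, which is why the joint is merely summed.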