Independence (probability theory)

From Wikipedia, the free encyclopedia

Revision as of 12:57, 29 June 2001

When we assert that two or more Random Variables are independent, we imply that probabilities of compound events involving these variables can be calculated by simply multiplying the probabilities of the individual events. This can be expressed in several ways; the most general statement is:


  • Pr[(X in A) & (Y in B)] = Pr[X in A]*Pr[Y in B] for any subsets A and B of the sample spaces of X and Y.
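
A concrete numerical check of this product rule (not part of the original article): the Python sketch below takes X and Y to be two independent fair six-sided dice, with the event sets A and B chosen arbitrarily for the illustration, and verifies by enumeration that the joint probability equals the product of the marginals.

    from itertools import product

    # Two independent fair dice: each of the 36 outcomes (x, y) has probability 1/36.
    outcomes = list(product(range(1, 7), repeat=2))

    A = {2, 4, 6}   # example event "X is even"
    B = {5, 6}      # example event "Y is at least 5"

    # Pr[(X in A) & (Y in B)] by direct enumeration over the joint sample space.
    p_joint = sum(1 for x, y in outcomes if x in A and y in B) / 36

    # Marginal probabilities Pr[X in A] and Pr[Y in B].
    p_A = sum(1 for x, y in outcomes if x in A) / 36
    p_B = sum(1 for x, y in outcomes if y in B) / 36

    print(p_joint, p_A * p_B)   # both equal 1/6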


In terms of joint and marginal probability densities, we find:


  • f_XY(x,y) dx dy = f_X(x) dx * f_Y(y) dy, where f denotes a density and the subscripts on f indicate the random variables involved.
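
As a sketch (again not from the original article), the factorization can be checked pointwise for one simple case: X and Y taken to be independent standard normal variables, a choice made only for the illustration, so that f_XY(x,y) is the bivariate normal density with identity covariance and f_X, f_Y are standard normal densities.

    import numpy as np

    # Standard normal density, used as the marginal for both X and Y (example choice).
    def phi(t):
        return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)

    # For independent standard normals the joint density is the bivariate normal
    # with identity covariance; it should equal the product of the marginals.
    for x in np.linspace(-3, 3, 7):
        for y in np.linspace(-3, 3, 7):
            joint = np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)
            assert np.isclose(joint, phi(x) * phi(y))

    print("joint density equals product of marginals on the grid")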


In terms of the Expectation Operator, we have:


  • E[X*Y] = E[X]*E[Y] (this follows from independence whenever the expectations exist; the converse does not hold in general, since uncorrelated random variables need not be independent).
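
A minimal Monte Carlo sketch of this identity (the particular distributions below are arbitrary choices made for the example, not anything specified by the article): with independent draws, the sample mean of X*Y should approach the product of the sample means of X and Y.

    import numpy as np

    rng = np.random.default_rng(0)   # fixed seed so the sketch is reproducible
    n = 1_000_000

    # Independent draws: X uniform on [0, 1], Y exponential with mean 1.
    X = rng.uniform(0.0, 1.0, size=n)
    Y = rng.exponential(1.0, size=n)

    print(np.mean(X * Y))            # estimate of E[X*Y]
    print(np.mean(X) * np.mean(Y))   # estimate of E[X]*E[Y]; both are close to 0.5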


back to Statistics/Assumptions