
User:Hippasus/Sandbox


In probability theory, the probability distribution of the sum of two or more independent random variables is the convolution of their individual distributions. Many distributions have well-known convolutions. The following is a list of these convolutions. Each statement is of the form

    X_1 + X_2 + \cdots + X_n \sim Y,

where X_1, X_2, \ldots, X_n are independent and identically distributed random variables.
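This relationship can be illustrated numerically. The sketch below is a minimal check, assuming NumPy and SciPy are available, and uses the well-known fact that the sum of two independent binomial variables with a common success probability is again binomial; the parameters are chosen only for illustration.

    # A minimal numerical sketch, assuming NumPy and SciPy are available.
    # The PMF of the sum of two independent discrete random variables is the
    # discrete convolution of their PMFs; here the PMFs of Binomial(3, 0.4)
    # and Binomial(5, 0.4) are convolved and compared with Binomial(8, 0.4).
    import numpy as np
    from scipy.stats import binom

    p = 0.4
    pmf_a = binom.pmf(np.arange(4), 3, p)      # support {0, ..., 3}
    pmf_b = binom.pmf(np.arange(6), 5, p)      # support {0, ..., 5}

    pmf_sum = np.convolve(pmf_a, pmf_b)        # PMF of the sum, support {0, ..., 8}
    pmf_direct = binom.pmf(np.arange(9), 8, p)

    print(np.allclose(pmf_sum, pmf_direct))    # True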

Discrete Distributions

Continuous Distributions


Example Proof

There are various ways to prove the above relations. A straightforward technique is to use the moment generating function, which is unique to a given distribution.

Proof that \sum_{i=1}^{n} \mathrm{Bernoulli}(p) \sim \mathrm{Binomial}(n, p)

Let X_1, X_2, \ldots, X_n \sim \mathrm{Bernoulli}(p) be independent and let Y \sim \mathrm{Binomial}(n, p). The moment generating function of each X_i and of Y is

    M_{X_i}(t) = 1 - p + p e^t, \qquad M_Y(t) = \left(1 - p + p e^t\right)^n,

where t is within some neighborhood of zero. The moment generating function of the sum is then

    M_{X_1 + \cdots + X_n}(t) = \operatorname{E}\!\left[e^{t(X_1 + \cdots + X_n)}\right] = \operatorname{E}\!\left[e^{t X_1} \cdots e^{t X_n}\right] = \prod_{i=1}^{n} \operatorname{E}\!\left[e^{t X_i}\right] = \left(1 - p + p e^t\right)^n = M_Y(t).

The expectation of the product is the product of the expectations because the X_i are independent. Since X_1 + \cdots + X_n and Y have the same moment generating function, they must have the same distribution.
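
The same conclusion can be checked symbolically. The sketch below is a minimal illustration, assuming SymPy is available and writing s in place of e^t, so that both moment generating functions become polynomials that can be compared by expansion.

    # A minimal symbolic check of the argument above, assuming SymPy is available.
    # Writing s for e^t, the Bernoulli(p) MGF is 1 - p + p*s and the
    # Binomial(n, p) MGF is sum_k C(n, k) * p^k * (1 - p)^(n - k) * s^k.
    import sympy as sp

    s, p = sp.symbols('s p')
    n = 5  # a concrete n chosen only for this check

    mgf_bernoulli = 1 - p + p * s
    mgf_binomial = sum(sp.binomial(n, k) * p**k * (1 - p)**(n - k) * s**k
                       for k in range(n + 1))

    # Independence turns the MGF of the sum into the product of the MGFs, so the
    # n-th power of the Bernoulli MGF should equal the Binomial(n, p) MGF.
    print(sp.expand(mgf_bernoulli**n - mgf_binomial))  # prints 0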

