Distribution of the product of two random variables


A product distribution is a probability distribution constructed as the distribution of the product of random variables having two other known distributions. Given two statistically independent random variables X and Y, the distribution of the random variable Z that is formed as the product

Z = XY

is a product distribution.

Algebra of random variables

The product is one operation in the algebra of random variables: related to the product distribution are the ratio distribution, the sum distribution (see List of convolutions of probability distributions) and the difference distribution. More generally, one may speak of combinations of sums, differences, products and ratios.

Many of these distributions are described in Melvin D. Springer's 1979 book The Algebra of Random Variables.[1]

Derivation for independent random variables

If X and Y are two independent, continuous random variables described by probability density functions f_X and f_Y, then the probability density function of Z = XY is[2]

f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z/x) · (1/|x|) dx.
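A minimal numerical sketch of this formula, assuming SciPy is available and taking X and Y to be standard exponential variables purely for illustration: the probability P(Z ≤ 1) obtained by integrating the computed density is compared against a Monte Carlo estimate.

```python
import numpy as np
from scipy import integrate, stats

f_X = stats.expon.pdf  # density of X (standard exponential, an illustrative choice)
f_Y = stats.expon.pdf  # density of Y

def product_pdf(z):
    """f_Z(z) = integral over x of f_X(x) * f_Y(z/x) / |x| dx (x > 0 here since X > 0)."""
    integrand = lambda x: f_X(x) * f_Y(z / x) / abs(x)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

# Cross-check: P(Z <= 1) from the computed density versus a Monte Carlo estimate.
rng = np.random.default_rng(0)
z = rng.exponential(size=1_000_000) * rng.exponential(size=1_000_000)
cdf_numeric, _ = integrate.quad(product_pdf, 0, 1.0)
print(cdf_numeric, np.mean(z <= 1.0))  # the two values should agree to a few decimals
```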

Special cases

The product of two random variables that have lognormal distributions is again lognormally distributed. This is a special case of a more general set of results that rely on the logarithm of the product being the sum of the logarithms of the factors. Thus, when the distributions of those logarithms have a simple convolution (see the list of convolutions of probability distributions), the resulting sum distribution can be transformed back to give the distribution of the product. However, this approach is useful only when the logarithms of the factors belong to standard families of distributions.
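A minimal sketch of the log-sum idea for the lognormal case (the parameter values below are illustrative assumptions, not taken from the article): if X ~ LogNormal(μ₁, σ₁²) and Y ~ LogNormal(μ₂, σ₂²) are independent, then log(XY) = log X + log Y is normal with mean μ₁ + μ₂ and variance σ₁² + σ₂², so XY is LogNormal(μ₁ + μ₂, σ₁² + σ₂²).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu1, s1, mu2, s2 = 0.3, 0.5, -0.2, 0.8      # illustrative parameter choices
x = rng.lognormal(mu1, s1, size=500_000)
y = rng.lognormal(mu2, s2, size=500_000)

# Compare the sampled product against the predicted LogNormal(mu1+mu2, s1^2+s2^2).
predicted = stats.lognorm(s=np.hypot(s1, s2), scale=np.exp(mu1 + mu2))
print(np.mean(x * y <= 1.0), predicted.cdf(1.0))  # the two values should be nearly equal
```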

The product of a random variable having a uniform distribution on (0,1) and an independent random variable having a gamma distribution with shape parameter equal to 2 has an exponential distribution.[3] A more general case concerns the product of a random variable having a beta distribution and an independent random variable having a gamma distribution: when the parameters of the two component distributions are related in a particular way, the result is again a gamma distribution, but with a changed shape parameter.[3]
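A quick Monte Carlo sketch of the first of these facts (the scale parameter 1 and the sample size are arbitrary choices made for illustration): samples of U·G, with U uniform on (0,1) and G gamma-distributed with shape 2 and scale 1, should be statistically indistinguishable from a standard exponential sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
u = rng.uniform(size=1_000_000)
g = rng.gamma(shape=2.0, scale=1.0, size=1_000_000)
z = u * g

# Kolmogorov-Smirnov distance to the Exponential(1) CDF should be very small.
print(stats.kstest(z, "expon").statistic)
```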

The K-distribution is an example of a non-standard distribution that can be defined as a product distribution (where both components have a gamma distribution).
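As a sketch of that construction (the shape and scale parameters below are illustrative assumptions, not values prescribed by the K-distribution), samples can be generated simply by multiplying two independent gamma samples.

```python
import numpy as np

rng = np.random.default_rng(3)
k_samples = (rng.gamma(shape=2.0, scale=1.0, size=100_000)
             * rng.gamma(shape=3.0, scale=1.0, size=100_000))
print(k_samples.mean())  # by independence the mean is the product of the means, 2 * 3 = 6
```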

In theoretical computer science

In computational learning theory, a product distribution D over {0,1}^n is specified by the parameters μ_1, μ_2, ..., μ_n. Each parameter μ_i gives the marginal probability that the ith bit of x sampled as x ∼ D is 1; i.e. μ_i = Pr_{x∼D}[x_i = 1]. In this setting, the uniform distribution is simply a product distribution with every μ_i = 1/2.
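A minimal sketch of sampling from such a product distribution (the function name and the parameter values are illustrative choices, not from the literature):

```python
import numpy as np

def sample_product_distribution(mu, n_samples, seed=0):
    """Draw samples x ~ D over {0,1}^n with Pr[x_i = 1] = mu[i], bits independent."""
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu)
    return (rng.random((n_samples, mu.size)) < mu).astype(np.uint8)

# The uniform distribution over {0,1}^4 is the product distribution with every mu_i = 1/2.
xs = sample_product_distribution([0.5, 0.5, 0.5, 0.5], n_samples=100_000)
print(xs.mean(axis=0))  # each column mean should be close to 0.5
```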

Product distributions are a key tool used for proving learnability results when the examples cannot be assumed to be uniformly sampled.[4] They give rise to an inner product ⟨·,·⟩_D on the space of real-valued functions on {0,1}^n as follows:

⟨f, g⟩_D = E_{x∼D}[f(x) g(x)].

This inner product gives rise to a corresponding norm as follows:

‖f‖_D = √⟨f, f⟩_D.
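A small sketch of estimating these quantities by sampling (the functions f and g and the marginals μ are arbitrary illustrative choices, not drawn from the source):

```python
import numpy as np

rng = np.random.default_rng(4)
mu = np.array([0.9, 0.1, 0.5])                      # illustrative bit marginals
xs = (rng.random((200_000, mu.size)) < mu).astype(float)

f = lambda x: 2 * x[:, 0] - 1                       # example function on {0,1}^3
g = lambda x: x[:, 1] * x[:, 2]

inner = np.mean(f(xs) * g(xs))                      # Monte Carlo estimate of <f, g>_D
norm_f = np.sqrt(np.mean(f(xs) ** 2))               # Monte Carlo estimate of ||f||_D
print(inner, norm_f)
```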


Notes

  1. Springer, Melvin Dale (1979). The Algebra of Random Variables. Wiley. ISBN 0-471-01406-0. Retrieved 24 September 2012.
  2. Rohatgi, V. K. (1976). An Introduction to Probability Theory and Mathematical Statistics. New York: Wiley. ISBN 978-0471731351. Retrieved 24 September 2012.
  3. Johnson, Norman L.; Kotz, Samuel; Balakrishnan, N. (1995). Continuous Univariate Distributions, Volume 2 (Second ed.). Wiley. p. 306. ISBN 0-471-58494-0. Retrieved 24 September 2012.
  4. Servedio, Rocco A. (2004). "On learning monotone DNF under product distributions". Information and Computation. 193 (1): 57–74. doi:10.1016/j.ic.2004.04.003.
