Inverse distribution

From Wikipedia, the free encyclopedia
Not to be confused with Inverse distribution function.

In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable. Inverse distributions arise in particular in the Bayesian context of prior distributions and posterior distributions for scale parameters. In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.

Relation to original distribution

In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1/X. If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that

    G(y) = \Pr(Y \le y) = \Pr\!\left(\frac{1}{X} \le y\right) = \Pr\!\left(X \ge \frac{1}{y}\right) = 1 - F\!\left(\frac{1}{y}\right), \qquad y > 0.

Then the density function of Y is found as the derivative of the cumulative distribution function:

    g(y) = \frac{1}{y^{2}}\, f\!\left(\frac{1}{y}\right), \qquad y > 0.
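
These two relations can be checked numerically. Below is a minimal Monte Carlo sketch (not part of the article), taking X to be exponential with rate 1 so that the support is strictly positive; the distribution and all numerical settings are illustrative choices.

```python
# Minimal Monte Carlo sketch: check G(y) = 1 - F(1/y) for X exponential with
# rate 1 (strictly positive support).  All settings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0
x = rng.exponential(scale=1.0 / lam, size=1_000_000)
y = 1.0 / x                                   # reciprocal variable Y = 1/X

F = lambda t: 1.0 - np.exp(-lam * t)          # CDF of X
for t in (0.5, 1.0, 2.0, 4.0):
    print((y <= t).mean(), 1.0 - F(1.0 / t))  # empirical vs. G(t) = 1 - F(1/t)
```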

Examples

Reciprocal distribution

The reciprocal distribution has a density function of the form[1]

    f(x) \propto \frac{1}{x}, \qquad 0 < a \le x \le b,

where \propto means "is proportional to". It follows that the inverse distribution in this case is of the form

    g(y) \propto \frac{1}{y}, \qquad b^{-1} \le y \le a^{-1},

which is again a reciprocal distribution.
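
As a sketch of the intermediate step (not spelled out in the article), substituting this density into the general formula above gives

    g(y) \;=\; \frac{1}{y^{2}}\, f\!\left(\frac{1}{y}\right) \;\propto\; \frac{1}{y^{2}} \cdot y \;=\; \frac{1}{y}, \qquad b^{-1} \le y \le a^{-1}.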

Inverse uniform distribution

Parameters: 0 < a < b
Support: b^{-1} \le y \le a^{-1}
PDF: \frac{1}{y^{2}(b-a)}
CDF: \frac{b - y^{-1}}{b-a}
Mean: \frac{\ln(b/a)}{b-a}
Median: \frac{2}{a+b}
Variance: \frac{1}{ab} - \left(\frac{\ln(b/a)}{b-a}\right)^{2}

If the original random variable X is uniformly distributed on the interval (a,b), where a > 0, then the reciprocal variable Y = 1/X has the reciprocal distribution which takes values in the range (b^{-1}, a^{-1}), and the probability density function in this range is

    g(y) = \frac{1}{y^{2}(b-a)},

and is zero elsewhere.

The cumulative distribution function of the reciprocal, within the same range, is

    G(y) = \frac{b - y^{-1}}{b-a}.
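
These formulas can be checked by simulation. A minimal sketch follows, with illustrative endpoints a = 2, b = 5 that are not taken from the article.

```python
# Simulation sketch: check the mean and CDF of Y = 1/X for X uniform on (a, b).
# The endpoints a = 2, b = 5 and the test point are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
a, b = 2.0, 5.0
y = 1.0 / rng.uniform(a, b, size=1_000_000)

mean_formula = np.log(b / a) / (b - a)            # closed-form mean from above
cdf_formula = lambda t: (b - 1.0 / t) / (b - a)   # valid for 1/b <= t <= 1/a

print(y.mean(), mean_formula)                     # simulated vs. exact mean
t = 0.3                                           # a point inside (1/b, 1/a)
print((y <= t).mean(), cdf_formula(t))            # simulated vs. exact CDF
```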

Inverse t distribution

Let X be a t distributed random variate with k degrees of freedom. Then its density function is

    f(x) = \frac{\Gamma\!\left(\frac{k+1}{2}\right)}{\sqrt{k\pi}\;\Gamma\!\left(\frac{k}{2}\right)} \left(1 + \frac{x^{2}}{k}\right)^{-\frac{k+1}{2}}.

The density of Y = 1/X is

    g(y) = \frac{\Gamma\!\left(\frac{k+1}{2}\right)}{\sqrt{k\pi}\;\Gamma\!\left(\frac{k}{2}\right)} \frac{1}{y^{2}} \left(1 + \frac{1}{k y^{2}}\right)^{-\frac{k+1}{2}}.
With k = 1, the distributions of X and 1 / X are identical. If k > 1 then the distribution of 1 / X is bimodal.[citation needed]
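
The bimodality is easy to see numerically. The following rough sketch (the choice k = 3 and all other settings are illustrative, not from the article) locates one mode on each side of zero and shows the dip at the origin.

```python
# Rough illustration: for k = 3 the histogram of 1/X has a mode on each side
# of zero and nearly vanishes at the origin.  Settings are illustrative.
import numpy as np

rng = np.random.default_rng(2)
k = 3
y = 1.0 / rng.standard_t(k, size=1_000_000)

hist, edges = np.histogram(y, bins=np.linspace(-4, 4, 161), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
neg, pos = centers < 0, centers > 0

print(centers[neg][hist[neg].argmax()])   # location of the negative mode
print(centers[pos][hist[pos].argmax()])   # location of the positive mode
print(hist[np.abs(centers).argmin()])     # density near zero is much smaller
```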

Reciprocal normal distribution

If X is a standard normally distributed variable then the distribution of 1/X is bimodal,[2] and the first and higher-order moments do not exist.[2]
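
As a sketch (not stated explicitly in the article), applying the general change of variables g(y) = f(1/y)/y², which remains valid pointwise for y ≠ 0, to the standard normal density gives

    g(y) \;=\; \frac{1}{\sqrt{2\pi}\, y^{2}} \exp\!\left(-\frac{1}{2 y^{2}}\right), \qquad y \neq 0,

which tends to 0 both as y → 0 (producing the two modes) and, like 1/y², as |y| → ∞, which is why the moments diverge.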

Inverse exponential distribution

If X is an exponentially distributed random variable with rate parameter λ, then Y = 1/X has the following cumulative distribution function:

    F_Y(y) = e^{-\lambda / y}, \qquad y > 0.

Note that the expected value of this random variable does not exist. The reciprocal exponential distribution finds use in the analysis of fading wireless communication systems.
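
The stated form follows in one line from the general relation above (a sketch, using the same notation):

    F_Y(y) \;=\; \Pr\!\left(\tfrac{1}{X} \le y\right) \;=\; \Pr\!\left(X \ge \tfrac{1}{y}\right) \;=\; e^{-\lambda / y}, \qquad y > 0.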

Inverse Cauchy distribution

If X is a Cauchy distributed (μ, σ) random variable, then 1/X is a Cauchy (μ/C, σ/C) random variable, where C = μ² + σ².
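
A small numerical sketch of this identity (the values μ = 1, σ = 2 are illustrative, not from the article): it compares the change-of-variables density of 1/X with the stated Cauchy(μ/C, σ/C) density on a grid.

```python
# Check: the density of 1/X obtained via g(y) = f(1/y)/y^2 matches the
# Cauchy(mu/C, sigma/C) density with C = mu^2 + sigma^2.  Values illustrative.
import numpy as np

def cauchy_pdf(x, mu, sigma):
    return 1.0 / (np.pi * sigma * (1.0 + ((x - mu) / sigma) ** 2))

mu, sigma = 1.0, 2.0
C = mu**2 + sigma**2
y = np.linspace(-3, 3, 7)
y = y[y != 0]                                  # the transform needs y != 0

lhs = cauchy_pdf(1.0 / y, mu, sigma) / y**2    # density of 1/X by change of variables
rhs = cauchy_pdf(y, mu / C, sigma / C)         # claimed Cauchy(mu/C, sigma/C) density
print(np.allclose(lhs, rhs))                   # True
```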

Inverse F distribution

If X is an F(ν₁, ν₂) distributed random variable then 1/X is an F(ν₂, ν₁) random variable.
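
A quick check of this swap with SciPy (the degrees of freedom 5 and 8 are illustrative choices): P(1/X ≤ y) = P(X ≥ 1/y) should equal the F(ν₂, ν₁) CDF evaluated at y.

```python
# Check the degrees-of-freedom swap numerically with SciPy.
import numpy as np
from scipy.stats import f

nu1, nu2 = 5, 8
y = np.array([0.25, 0.5, 1.0, 2.0, 4.0])

lhs = f.sf(1.0 / y, nu1, nu2)      # P(X >= 1/y) for X ~ F(nu1, nu2)
rhs = f.cdf(y, nu2, nu1)           # CDF of an F(nu2, nu1) variable at y
print(np.allclose(lhs, rhs))       # True
```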

Other inverse distributions

Other inverse distributions include the inverse-chi-squared distribution, the inverse-gamma distribution, the inverse-Wishart distribution, and the inverse matrix gamma distribution.

Applications

Inverse distributions are widely used as prior distributions in Bayesian inference for scale parameters.

References

  1. ^ Hamming, R. W. (1970). "On the distribution of numbers". The Bell System Technical Journal. 49 (8): 1609–1625.
  2. ^ a b Johnson, Norman L.; Kotz, Samuel; Balakrishnan, Narayanaswamy (1994). Continuous Univariate Distributions, Volume 1. Wiley. p. 171. ISBN 0-471-58495-9. (This is a special case of the generalized inverse normal distribution treated there.)