Inverse distribution

In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable. Inverse distributions arise in particular in the Bayesian context of prior distributions and posterior distributions for scale parameters. In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.

Relation to original distribution

In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1 / X. If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that

${\displaystyle G(y)=\Pr(Y\leq y)=\Pr \left(X\geq {\frac {1}{y}}\right)=1-\Pr \left(X<{\frac {1}{y}}\right)=1-F\left({\frac {1}{y}}\right).}$

Then the density function of Y is found as the derivative of the cumulative distribution function:

${\displaystyle g(y)={\frac {1}{y^{2}}}f\left({\frac {1}{y}}\right).}$
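This transformation can be verified numerically. The sketch below uses an illustrative (assumed) density f(x) = 2x on (0, 1), so that F(x) = x², G(y) = 1 − 1/y², and g(y) = f(1/y)/y² = 2/y³ for y > 1; it compares the empirical distribution of Y = 1/X with G and checks that g is the derivative of G:

```python
import math
import random

random.seed(0)

# Assumed example density: f(x) = 2x on (0, 1), so F(x) = x^2.
# Then G(y) = 1 - F(1/y) = 1 - 1/y^2 and g(y) = f(1/y)/y^2 = 2/y^3, y > 1.
n = 200_000
xs = [math.sqrt(1.0 - random.random()) for _ in range(n)]  # inverse-CDF sampling of X
ys = [1.0 / x for x in xs]                                 # Y = 1/X

for y in (1.5, 2.0, 4.0):
    empirical = sum(v <= y for v in ys) / n
    assert abs(empirical - (1 - 1 / y**2)) < 0.01          # matches G(y)
    h = 1e-5                                               # g should be G's derivative
    dG = ((1 - 1 / (y + h)**2) - (1 - 1 / (y - h)**2)) / (2 * h)
    assert abs(dG - 2 / y**3) < 1e-6
```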

Examples

Reciprocal distribution

The reciprocal distribution has a density function of the form[1]

${\displaystyle f(x)\propto x^{-1}\quad {\text{for }}0<a\leq x\leq b,}$

where ${\displaystyle \propto \!\,}$ means "is proportional to". It follows that the inverse distribution in this case is of the form

${\displaystyle g(y)\propto y^{-1}\quad {\text{for }}0<b^{-1}\leq y\leq a^{-1},}$

which is again a reciprocal distribution.
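This closure property can be checked by simulation. In the sketch below (support endpoints a, b are illustrative assumptions), X is drawn from the reciprocal distribution via inverse-CDF sampling, using F(x) = ln(x/a)/ln(b/a), and the empirical distribution of Y = 1/X is compared with the reciprocal CDF on [1/b, 1/a]:

```python
import math
import random

random.seed(1)

a, b = 1.0, 10.0   # assumed support endpoints, 0 < a < b
n = 200_000
# Inverse-CDF sampling: F(x) = ln(x/a)/ln(b/a), so x = a * (b/a)**u
xs = [a * (b / a) ** random.random() for _ in range(n)]
ys = [1 / x for x in xs]

def G(y):
    # CDF of a reciprocal distribution on [1/b, 1/a]
    return math.log(y * b) / math.log(b / a)

for y in (0.15, 0.4, 0.8):
    emp = sum(v <= y for v in ys) / n
    assert abs(emp - G(y)) < 0.01
```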

Inverse uniform distribution

Parameters: ${\displaystyle 0<a<b}$
Support: ${\displaystyle [b^{-1},a^{-1}]}$
PDF: ${\displaystyle y^{-2}{\frac {1}{b-a}}}$
CDF: ${\displaystyle {\frac {b-y^{-1}}{b-a}}}$
Mean: ${\displaystyle {\frac {\ln({\frac {1}{a}})-\ln({\frac {1}{b}})}{b-a}}}$
Median: ${\displaystyle {\frac {2}{a+b}}}$
Variance: ${\displaystyle {\frac {1}{a\cdot b}}-\left({\frac {\ln({\frac {1}{a}})-\ln({\frac {1}{b}})}{b-a}}\right)^{2}}$

If the original random variable X is uniformly distributed on the interval (a, b), where a > 0, then the reciprocal variable Y = 1 / X takes values in the range ${\displaystyle (b^{-1},a^{-1})}$, and its probability density function in this range is

${\displaystyle g(y)=y^{-2}{\frac {1}{b-a}},}$

and is zero elsewhere.

The cumulative distribution function of the reciprocal, within the same range, is

${\displaystyle G(y)={\frac {b-y^{-1}}{b-a}}.}$
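A quick Monte Carlo sanity check of this CDF (the interval endpoints below are illustrative assumptions):

```python
import random

random.seed(2)

a, b = 2.0, 5.0    # assumed interval with a > 0
n = 200_000
ys = [1 / random.uniform(a, b) for _ in range(n)]  # Y = 1/X, X ~ Uniform(a, b)

def G(y):
    # CDF of the reciprocal on (1/b, 1/a)
    return (b - 1 / y) / (b - a)

for y in (0.25, 0.3, 0.45):
    emp = sum(v <= y for v in ys) / n
    assert abs(emp - G(y)) < 0.01
```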

Inverse t distribution

Let X be a t distributed random variate with k degrees of freedom. Then its density function is

${\displaystyle f(x)={\frac {1}{\sqrt {k\pi }}}{\frac {\Gamma \left({\frac {k+1}{2}}\right)}{\Gamma \left({\frac {k}{2}}\right)}}{\frac {1}{\left(1+{\frac {x^{2}}{k}}\right)^{\frac {1+k}{2}}}}.}$

The density of Y = 1 / X is

${\displaystyle g(y)={\frac {1}{\sqrt {k\pi }}}{\frac {\Gamma \left({\frac {k+1}{2}}\right)}{\Gamma \left({\frac {k}{2}}\right)}}{\frac {1}{y^{2}\left(1+{\frac {1}{y^{2}k}}\right)^{\frac {1+k}{2}}}}.}$

With k = 1, the distributions of X and 1 / X are identical (X is then standard Cauchy distributed). If k > 1, the distribution of 1 / X is bimodal.[citation needed]
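The k = 1 (standard Cauchy) case can be checked by simulation: the sketch below samples a standard Cauchy via the inverse-CDF tangent transform and compares the empirical distribution of 1/X with the Cauchy CDF F(x) = 1/2 + arctan(x)/π:

```python
import math
import random

random.seed(3)

n = 200_000
# Standard Cauchy via inverse CDF: x = tan(pi * (u - 1/2))
xs = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
ys = [1 / x for x in xs if x != 0.0]   # guard against an (unlikely) exact zero

def F(x):
    # standard Cauchy CDF
    return 0.5 + math.atan(x) / math.pi

for t in (-2.0, -0.5, 0.5, 2.0):
    emp = sum(v <= t for v in ys) / len(ys)
    assert abs(emp - F(t)) < 0.01      # 1/X is again standard Cauchy
```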

Reciprocal normal distribution

If X is a standard normally distributed variable then the distribution of 1/X is bimodal,[2] and the first and higher-order moments do not exist.[2]
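The bimodality is easy to exhibit: by the transformation above, the density of Y = 1/X is g(y) = φ(1/y)/y², where φ is the standard normal density, and setting its derivative to zero gives local maxima at y = ±1/√2. A short numerical sketch that locates the two modes on a grid:

```python
import math

def g(y):
    # density of 1/X for standard normal X: phi(1/y) / y^2
    return math.exp(-1 / (2 * y * y)) / (math.sqrt(2 * math.pi) * y * y)

# Evaluate on a grid over [-3, 3], skipping y = 0 (g is undefined there)
ys = [i / 100 for i in range(-300, 301) if i != 0]
vals = [g(y) for y in ys]

# Strict interior local maxima
peaks = [ys[i] for i in range(1, len(vals) - 1)
         if vals[i] > vals[i - 1] and vals[i] > vals[i + 1]]

assert len(peaks) == 2                              # two modes
assert abs(abs(peaks[0]) - 1 / math.sqrt(2)) < 0.02  # near +/- 1/sqrt(2)
```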

Inverse exponential distribution

If ${\displaystyle X}$ is an exponentially distributed random variable with rate parameter ${\displaystyle \lambda }$, then ${\displaystyle Y=1/X}$ has the following cumulative distribution function: ${\displaystyle F_{Y}(y)=e^{-\lambda /y}}$ for ${\displaystyle y>0}$. Note that the expected value of this random variable does not exist. The reciprocal exponential distribution finds use in the analysis of fading wireless communication systems.
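This CDF follows directly from the general relation G(y) = 1 − F(1/y) with F(x) = 1 − e^{−λx}. A Monte Carlo sanity check (the rate parameter below is an illustrative assumption):

```python
import math
import random

random.seed(4)

lam = 1.5          # assumed rate parameter
n = 200_000
ys = [1 / random.expovariate(lam) for _ in range(n)]  # Y = 1/X, X ~ Exp(lam)

for y in (0.5, 1.0, 3.0):
    emp = sum(v <= y for v in ys) / n
    assert abs(emp - math.exp(-lam / y)) < 0.01       # F_Y(y) = exp(-lam/y)
```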

Inverse Cauchy distribution

If X is a Cauchy distributed (μ, σ) random variable, then 1 / X is a Cauchy (μ / C, σ / C) random variable, where C = μ² + σ².
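This can be checked by simulation: sample X ~ Cauchy(μ, σ) via the tangent transform and compare the empirical distribution of 1/X with the Cauchy(μ/C, σ/C) CDF (the values of μ and σ below are illustrative assumptions):

```python
import math
import random

random.seed(5)

mu, sigma = 3.0, 4.0          # assumed location and scale
C = mu**2 + sigma**2          # here C = 25, so 1/X ~ Cauchy(0.12, 0.16)
n = 200_000
xs = [mu + sigma * math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
ys = [1 / x for x in xs if x != 0.0]

def F(t):
    # Cauchy(mu/C, sigma/C) CDF
    return 0.5 + math.atan((t - mu / C) / (sigma / C)) / math.pi

for t in (-0.2, 0.12, 0.5):
    emp = sum(v <= t for v in ys) / len(ys)
    assert abs(emp - F(t)) < 0.01
```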

Inverse F distribution

If X is an F(ν₁, ν₂) distributed random variable, then 1 / X is an F(ν₂, ν₁) random variable.
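This symmetry can be checked by simulation. The sketch below (degrees of freedom are illustrative assumptions) builds F variates as ratios of scaled chi-squared variates using only the standard library, and checks the sample mean of 1/X against the F(ν₂, ν₁) mean ν₁/(ν₁ − 2):

```python
import random

random.seed(6)

d1, d2 = 6, 10     # assumed degrees of freedom (d1 > 4 so 1/X has finite variance)
n = 100_000

def chi2(k):
    # chi-squared variate with k degrees of freedom as a sum of squared normals
    return sum(random.gauss(0, 1) ** 2 for _ in range(k))

xs = [(chi2(d1) / d1) / (chi2(d2) / d2) for _ in range(n)]  # X ~ F(d1, d2)
ys = [1 / x for x in xs]                                    # should be F(d2, d1)

# mean of F(d2, d1) is d1 / (d1 - 2), i.e. 1.5 for d1 = 6
mean_y = sum(ys) / n
assert abs(mean_y - d1 / (d1 - 2)) < 0.05
```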

Reciprocal of binomial distribution

No closed form for this distribution is known. An asymptotic approximation for the mean is known.[3]

${\displaystyle E[(1+X)^{a}]=O((np)^{-a})+o(n^{-a})}$

where E[·] is the expectation operator, X is the binomial random variable, O(·) and o(·) are the big and little o order functions, n is the sample size, p is the probability of success, and a may be positive or negative, integer or fractional.
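For the special case a = −1 the mean of 1/(1 + X) has a well-known closed form, E[1/(1 + X)] = (1 − (1 − p)^{n+1}) / ((n + 1)p), which follows from the binomial identity C(n, k)/(k + 1) = C(n + 1, k + 1)/(n + 1). The sketch below verifies it against direct summation (the values of n and p are illustrative):

```python
from math import comb

n, p = 20, 0.3     # assumed (illustrative) sample size and success probability

# Direct expectation: sum over the binomial pmf
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) / (1 + k) for k in range(n + 1))

# Closed form for E[1/(1+X)]
closed = (1 - (1 - p)**(n + 1)) / ((n + 1) * p)

assert abs(exact - closed) < 1e-12
```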

Other inverse distributions

Other inverse distributions include

inverse-chi-squared distribution
inverse-gamma distribution
inverse-Wishart distribution
inverse matrix gamma distribution
inverse beta prime distribution: defined in the beta distribution article.

Applications

Inverse distributions are widely used as prior distributions in Bayesian inference for scale parameters.