In probability theory and statistics, the Rayleigh distribution is a continuous probability distribution for positive-valued random variables.
A Rayleigh distribution is often observed when the overall magnitude of a vector is related to its directional components. One example where the Rayleigh distribution naturally arises is when wind velocity is analyzed into its orthogonal 2-dimensional vector components. Assuming that each component is uncorrelated, normally distributed with equal variance, and zero mean, the overall wind speed (vector magnitude) will be characterized by a Rayleigh distribution. A second example arises in the case of random complex numbers whose real and imaginary components are independently and identically distributed Gaussian with equal variance and zero mean. In that case, the absolute value of the complex number is Rayleigh-distributed.
The distribution is named after Lord Rayleigh.[1]
Definition
The probability density function of the Rayleigh distribution is[2]
{\displaystyle f(x;\sigma )={\frac {x}{\sigma ^{2}}}e^{-x^{2}/(2\sigma ^{2})},\quad x\geq 0,}
where {\displaystyle \sigma } is the scale parameter of the distribution. The cumulative distribution function is[2]
{\displaystyle F(x;\sigma )=1-e^{-x^{2}/(2\sigma ^{2})}}
for {\displaystyle x\in [0,\infty ).}
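As a quick numerical check (a minimal Python sketch; the helper names are illustrative, not from any standard library), the pdf and cdf above can be evaluated directly:

```python
import math

def rayleigh_pdf(x, sigma):
    """f(x; sigma) = (x / sigma^2) * exp(-x^2 / (2 sigma^2)) for x >= 0."""
    return (x / sigma**2) * math.exp(-x**2 / (2 * sigma**2))

def rayleigh_cdf(x, sigma):
    """F(x; sigma) = 1 - exp(-x^2 / (2 sigma^2)) for x >= 0."""
    return 1.0 - math.exp(-x**2 / (2 * sigma**2))

sigma = 2.0
# At x = sigma the density attains its maximum, exp(-1/2) / sigma:
print(rayleigh_pdf(sigma, sigma))  # 0.3032...
print(rayleigh_cdf(sigma, sigma))  # 0.3934...
```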
Relation to random vector length
Consider the two-dimensional vector {\displaystyle Y=(U,V)} which has components that are Gaussian-distributed, centered at zero, and independent. Then
{\displaystyle f_{U}(u;\sigma )={\frac {e^{-u^{2}/\left(2\sigma ^{2}\right)}}{\sqrt {2\pi \sigma ^{2}}}}}
, and similarly for {\displaystyle f_{V}(v;\sigma )}.
Let {\displaystyle x} be the length of {\displaystyle Y}. It is distributed as
{\displaystyle f(x;\sigma )={\frac {1}{2\pi \sigma ^{2}}}\int _{-\infty }^{\infty }du\,\int _{-\infty }^{\infty }dv\,e^{-u^{2}/\left(2\sigma ^{2}\right)}e^{-v^{2}/\left(2\sigma ^{2}\right)}\delta (x-{\sqrt {u^{2}+v^{2}}}).}
By transforming to the polar coordinate system one has
{\displaystyle f(x;\sigma )={\frac {1}{2\pi \sigma ^{2}}}\int _{0}^{2\pi }\,d\phi \int _{0}^{\infty }dr\,\delta (r-x)re^{-r^{2}/\left(2\sigma ^{2}\right)}={\frac {x}{\sigma ^{2}}}e^{-x^{2}/\left(2\sigma ^{2}\right)},}
which is the Rayleigh distribution. It is straightforward to generalize to vectors of dimension other than 2. There are also generalizations when the components have unequal variance or correlations.
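This relationship is easy to check by simulation. The sketch below (my own illustration, using only the Python standard library) draws magnitudes of 2-D vectors with i.i.d. zero-mean Gaussian components and compares the empirical CDF at x = σ with the Rayleigh value 1 − e^(−1/2) ≈ 0.3935:

```python
import math
import random

random.seed(0)
sigma = 1.5
n = 200_000

# Magnitude of a 2-D vector with i.i.d. N(0, sigma^2) components.
samples = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
           for _ in range(n)]

# Fraction of magnitudes below sigma should match F(sigma; sigma).
empirical = sum(s <= sigma for s in samples) / n
print(empirical)  # close to 1 - exp(-1/2) = 0.3935...
```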
Properties
The raw moments are given by:
{\displaystyle \mu _{k}=\sigma ^{k}2^{\frac {k}{2}}\,\Gamma \left(1+{\frac {k}{2}}\right)}
where {\displaystyle \Gamma (z)} is the gamma function.
The mean and variance of a Rayleigh random variable may be expressed as:
{\displaystyle \mu (X)=\sigma {\sqrt {\frac {\pi }{2}}}\approx 1.253\sigma }
and
{\displaystyle {\textrm {var}}(X)={\frac {4-\pi }{2}}\sigma ^{2}\approx 0.429\sigma ^{2}}
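Both expressions follow from the raw-moment formula with k = 1 and k = 2, since var(X) = μ₂ − μ₁². A short stdlib-only check (my own sketch, not from the source):

```python
import math

def raw_moment(k, sigma):
    # mu_k = sigma^k * 2^(k/2) * Gamma(1 + k/2)
    return sigma**k * 2 ** (k / 2) * math.gamma(1 + k / 2)

sigma = 3.0
mean = raw_moment(1, sigma)           # sigma * sqrt(pi / 2)
var = raw_moment(2, sigma) - mean**2  # (4 - pi) / 2 * sigma^2

print(mean / sigma)    # 1.2533...
print(var / sigma**2)  # 0.4292...
```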
The mode is {\displaystyle \sigma }, and the maximum pdf is
{\displaystyle f_{\text{max}}=f(\sigma ;\sigma )={\frac {1}{\sigma }}e^{-{\frac {1}{2}}}\approx {\frac {0.606}{\sigma }}}
The skewness is given by:
{\displaystyle \gamma _{1}={\frac {2{\sqrt {\pi }}(\pi -3)}{(4-\pi )^{\frac {3}{2}}}}\approx 0.631}
The excess kurtosis is given by:
{\displaystyle \gamma _{2}=-{\frac {6\pi ^{2}-24\pi +16}{(4-\pi )^{2}}}\approx 0.245}
The characteristic function is given by:
{\displaystyle \varphi (t)=1-\sigma te^{-{\frac {1}{2}}\sigma ^{2}t^{2}}{\sqrt {\frac {\pi }{2}}}\left[{\textrm {erfi}}\left({\frac {\sigma t}{\sqrt {2}}}\right)-i\right]}
where {\displaystyle \operatorname {erfi} (z)} is the imaginary error function. The moment generating function is given by
{\displaystyle M(t)=1+\sigma t\,e^{{\frac {1}{2}}\sigma ^{2}t^{2}}{\sqrt {\frac {\pi }{2}}}\left[{\textrm {erf}}\left({\frac {\sigma t}{\sqrt {2}}}\right)+1\right]}
where {\displaystyle \operatorname {erf} (z)} is the error function.
Differential entropy
The differential entropy is given by[citation needed]
{\displaystyle H=1+\ln \left({\frac {\sigma }{\sqrt {2}}}\right)+{\frac {\gamma }{2}}}
where {\displaystyle \gamma } is the Euler–Mascheroni constant.
Differential equation
The pdf of the Rayleigh distribution is a solution of the following differential equation :
{\displaystyle \left\{{\begin{array}{l}\sigma ^{2}xf'(x)+f(x)\left(x^{2}-\sigma ^{2}\right)=0\\[10pt]f(1)={\frac {\exp \left(-{\frac {1}{2\sigma ^{2}}}\right)}{\sigma ^{2}}}\end{array}}\right\}}
Parameter estimation
Given a sample of N independent and identically distributed Rayleigh random variables {\displaystyle x_{i}} with parameter {\displaystyle \sigma },
{\displaystyle {\widehat {\sigma ^{2}}}\approx \!\,{\frac {1}{2N}}\sum _{i=1}^{N}x_{i}^{2}}
is an unbiased maximum likelihood estimate.
{\displaystyle {\hat {\sigma }}\approx \!\,{\sqrt {{\frac {1}{2N}}\sum _{i=1}^{N}x_{i}^{2}}}}
is a biased estimator that can be corrected via the formula
{\displaystyle \sigma ={\hat {\sigma }}{\frac {\Gamma (N){\sqrt {N}}}{\Gamma (N+{\frac {1}{2}})}}={\hat {\sigma }}{\frac {4^{N}N!(N-1)!{\sqrt {N}}}{(2N)!{\sqrt {\pi }}}}}
[3]
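A simulation sketch of these estimators (my own illustration, not from the source), drawing Rayleigh samples as magnitudes of Gaussian pairs:

```python
import math
import random

random.seed(1)
sigma_true = 2.0
N = 100_000

# Rayleigh(sigma) samples via magnitudes of i.i.d. N(0, sigma^2) pairs.
xs = [math.hypot(random.gauss(0, sigma_true), random.gauss(0, sigma_true))
      for _ in range(N)]

# Unbiased maximum likelihood estimate of sigma^2 ...
sigma2_hat = sum(x * x for x in xs) / (2 * N)
# ... and the (slightly biased) plug-in estimate of sigma.
sigma_hat = math.sqrt(sigma2_hat)

print(sigma2_hat)  # close to sigma_true**2 = 4.0
print(sigma_hat)   # close to 2.0
```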
Confidence intervals
To find the (1 − α) confidence interval, first find the two numbers {\displaystyle \chi _{1}^{2},\ \chi _{2}^{2}} where:
{\displaystyle Pr(\chi ^{2}(2N)\leq \chi _{1}^{2})=\alpha /2,\quad Pr(\chi ^{2}(2N)\leq \chi _{2}^{2})=1-\alpha /2}
then
{\displaystyle {\frac {N{\overline {x^{2}}}}{\chi _{2}^{2}}}\leq {\widehat {\sigma }}^{2}\leq {\frac {N{\overline {x^{2}}}}{\chi _{1}^{2}}}}
[4]
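As a sketch (my own construction, not from the source), the interval can be computed in stdlib Python; since the standard library has no chi-squared inverse CDF, the Wilson–Hilferty approximation is used here as a stand-in for the exact quantile:

```python
import math
from statistics import NormalDist

def chi2_quantile(p, df):
    """Approximate chi-squared quantile via the Wilson-Hilferty transformation
    (a stdlib-only stand-in for an exact inverse CDF)."""
    z = NormalDist().inv_cdf(p)
    return df * (1 - 2 / (9 * df) + z * math.sqrt(2 / (9 * df))) ** 3

def rayleigh_sigma2_ci(xs, alpha=0.05):
    """(1 - alpha) confidence interval for sigma^2 from Rayleigh samples xs."""
    N = len(xs)
    sum_x2 = sum(x * x for x in xs)  # equals N * mean(x^2)
    lo = sum_x2 / chi2_quantile(1 - alpha / 2, 2 * N)
    hi = sum_x2 / chi2_quantile(alpha / 2, 2 * N)
    return lo, hi
```

Since E[x²] = 2σ², the returned interval straddles mean(x²)/2, shrinking as N grows.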
Generating random variates
Given a random variate U drawn from the uniform distribution on the interval (0, 1), the variate
{\displaystyle X=\sigma {\sqrt {-2\ln(1-U)}}\,}
has a Rayleigh distribution with parameter {\displaystyle \sigma }. This is obtained by applying the inverse transform sampling method.
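A minimal stdlib-only sketch of this sampler (function names are illustrative):

```python
import math
import random

random.seed(42)

def rayleigh_variate(sigma):
    """Inverse transform sampling: X = sigma * sqrt(-2 ln(1 - U)), U ~ Uniform(0, 1)."""
    u = random.random()
    return sigma * math.sqrt(-2.0 * math.log(1.0 - u))

sigma = 1.0
xs = [rayleigh_variate(sigma) for _ in range(100_000)]
print(sum(xs) / len(xs))  # sample mean, close to sigma * sqrt(pi/2) = 1.2533...
```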
Related distributions
{\displaystyle R\sim \mathrm {Rayleigh} (\sigma )} is Rayleigh distributed if {\displaystyle R={\sqrt {X^{2}+Y^{2}}}}, where {\displaystyle X\sim N(0,\sigma ^{2})} and {\displaystyle Y\sim N(0,\sigma ^{2})} are independent normal random variables.[5] (This gives motivation to the use of the symbol "sigma" in the above parametrization of the Rayleigh density.)
The chi distribution with v = 2 is equivalent to the Rayleigh distribution with σ = 1. That is, if {\displaystyle R\sim \mathrm {Rayleigh} (1)}, then {\displaystyle R^{2}} has a chi-squared distribution with {\displaystyle N=2} degrees of freedom:
{\displaystyle [Q=R^{2}]\sim \chi ^{2}(N)\ .}
If {\displaystyle R\sim \mathrm {Rayleigh} (\sigma )}, then {\displaystyle \sum _{i=1}^{N}R_{i}^{2}} has a gamma distribution with parameters {\displaystyle N} and {\displaystyle 2\sigma ^{2}}:
{\displaystyle \left[Y=\sum _{i=1}^{N}R_{i}^{2}\right]\sim \Gamma (N,2\sigma ^{2}).}
The Rice distribution is a generalization of the Rayleigh distribution.
The Weibull distribution is a generalization of the Rayleigh distribution. In this instance, the parameter {\displaystyle \sigma } is related to the Weibull scale parameter {\displaystyle \lambda }:
{\displaystyle \lambda =\sigma {\sqrt {2}}.}
The Maxwell–Boltzmann distribution describes the magnitude of a normal vector in three dimensions.
If {\displaystyle X} has an exponential distribution {\displaystyle X\sim \mathrm {Exponential} (\lambda )}, then
{\displaystyle Y={\sqrt {X}}\sim \mathrm {Rayleigh} (1/{\sqrt {2\lambda }}).}
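The exponential relation can be checked by simulation (a stdlib-only sketch of my own): square roots of exponential variates should have the Rayleigh mean σ√(π/2) with σ = 1/√(2λ):

```python
import math
import random

random.seed(7)
lam = 2.0
n = 200_000

# If X ~ Exponential(lam), then sqrt(X) ~ Rayleigh(1 / sqrt(2 * lam)).
ys = [math.sqrt(random.expovariate(lam)) for _ in range(n)]

sigma = 1.0 / math.sqrt(2.0 * lam)  # implied Rayleigh scale, here 0.5
expected_mean = sigma * math.sqrt(math.pi / 2)

print(sum(ys) / n)      # sample mean
print(expected_mean)    # 0.6266..., should be close to the sample mean
```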
Applications
An application of the estimation of σ can be found in magnetic resonance imaging (MRI). As MRI images are recorded as complex images but most often viewed as magnitude images, the background data is Rayleigh distributed. Hence, the above formula can be used to estimate the noise variance in an MRI image from background data.[6] [7]
Proof of correctness – Unequal variances
We start with
{\displaystyle f(x;\sigma )={\frac {1}{2\pi \sigma _{1}\sigma _{2}}}\int _{-\infty }^{\infty }du\,\int _{-\infty }^{\infty }dv\,e^{-u^{2}/\left(2\sigma _{1}^{2}\right)}e^{-v^{2}/\left(2\sigma _{2}^{2}\right)}\delta (x-{\sqrt {u^{2}+v^{2}}}).}
as above, except with {\displaystyle \sigma _{1}} and {\displaystyle \sigma _{2}} distinct.
Let {\displaystyle a=u\sigma _{2}/\sigma _{1}}, so that {\displaystyle a/\sigma _{2}=u/\sigma _{1}}. Differentiating, we have:
{\displaystyle {\frac {\sigma _{1}}{\sigma _{2}}}\mathrm {d} a=\mathrm {d} u}
Substituting,
{\displaystyle f(x;\sigma )={\frac {\sigma _{1}}{\sigma _{2}}}{\frac {1}{2\pi \sigma _{1}\sigma _{2}}}\int _{-\infty }^{\infty }da\,\int _{-\infty }^{\infty }dv\,e^{-a^{2}/\left(2\sigma _{2}^{2}\right)}e^{-v^{2}/\left(2\sigma _{2}^{2}\right)}\delta \left(x-{\sqrt {\left({\frac {\sigma _{1}}{\sigma _{2}}}a\right)^{2}+v^{2}}}\right)}
As before, we perform a polar coordinate transformation:[8]
{\displaystyle \left\{{\begin{array}{l}a=r{\textrm {cos}}(\phi )\\v=r{\textrm {sin}}(\phi )\\{\textrm {d}}a{\textrm {d}}v=r{\textrm {d}}r{\textrm {d}}\phi \end{array}}\right.}
Substituting,
{\displaystyle f(x;\sigma )={\frac {\sigma _{1}}{\sigma _{2}}}{\frac {1}{2\pi \sigma _{1}\sigma _{2}}}\int _{0}^{2\pi }{\textrm {d}}\phi \,\int _{0}^{\infty }r\,{\textrm {d}}r\,e^{-r^{2}/\left(2\sigma _{2}^{2}\right)}\delta \left(x-r{\sqrt {\left({\frac {\sigma _{1}}{\sigma _{2}}}\right)^{2}\cos ^{2}\phi +\sin ^{2}\phi }}\right).}
Simplifying,
{\displaystyle f(x;\sigma )={\frac {1}{2\pi \sigma _{2}^{2}}}\int _{0}^{2\pi }{\textrm {d}}\phi \,\int _{0}^{\infty }r\,{\textrm {d}}r\,e^{-r^{2}/\left(2\sigma _{2}^{2}\right)}\delta \left(x-r{\sqrt {\left({\frac {\sigma _{1}}{\sigma _{2}}}\right)^{2}\cos ^{2}\phi +\sin ^{2}\phi }}\right).}
See Hoyt distribution for more information.
References
^ "The Wave Theory of Light", Encyclopædia Britannica, 1888; "The Problem of the Random Walk", Nature, 1905, vol. 72, p. 318
^ a b Papoulis, Athanasios; Pillai, S. (2001) Probability, Random Variables and Stochastic Processes. ISBN 0073660116, ISBN 9780073660110 [page needed]
^ Siddiqui, M. M. (1964) "Statistical inference for Rayleigh distributions", The Journal of Research of the National Bureau of Standards, Sec. D: Radio Science, Vol. 68D, No. 9, p. 1007
^ Siddiqui, M. M. (1961) "Some Problems Connected With Rayleigh Distributions", The Journal of Research of the National Bureau of Standards, Sec. D: Radio Propagation, Vol. 66D, No. 2, p. 169
^ Hogema, Jeroen (2005) "Shot group statistics"
^ Sijbers J., den Dekker A. J., Raman E. and Van Dyck D. (1999) "Parameter estimation from magnitude MR images", International Journal of Imaging Systems and Technology, 10(2), 109–114
^ den Dekker A. J., Sijbers J. (2014) "Data distributions in magnetic resonance images: a review", Physica Medica, 30(7), 725–741
^ http://physicspages.com/2012/12/24/coordinate-transformations-the-jacobian-determinant/