Wrapped normal distribution

From Wikipedia, the free encyclopedia
Wrapped normal
Probability density function: plot of the wrapped normal PDF; the support is chosen to be [-π,π] with μ=0
Cumulative distribution function: plot of the wrapped normal CDF; the support is chosen to be [-π,π] with μ=0
Parameters: \mu real, \sigma>0
Support: \theta \in any interval of length 2π
PDF: \frac{1}{2\pi}\vartheta\left(\frac{\theta-\mu}{2\pi},\frac{i\sigma^2}{2\pi}\right)
Mean: \mu
Median: \mu
Mode: \mu
Variance: 1-e^{-\sigma^2/2} (circular)
Entropy: (see text)
CF: e^{-\sigma^2n^2/2+in\mu}

In probability theory and directional statistics, a wrapped normal distribution is a wrapped probability distribution that results from the "wrapping" of the normal distribution around the unit circle. It finds application in the theory of Brownian motion and is a solution to the heat equation for periodic boundary conditions. It is closely approximated by the von Mises distribution, which, due to its mathematical simplicity and tractability, is the most commonly used distribution in directional statistics.

Definition

The probability density function of the wrapped normal distribution is[1]


f_{WN}(\theta;\mu,\sigma)=\frac{1}{\sigma \sqrt{2\pi}} \sum^{\infty}_{k=-\infty} \exp \left[\frac{-(\theta - \mu + 2\pi k)^2}{2 \sigma^2} \right]

where μ and σ are the mean and standard deviation of the unwrapped distribution, respectively. Expressing the above density function in terms of the characteristic function of the normal distribution yields:[1]


f_{WN}(\theta;\mu,\sigma)=\frac{1}{2\pi}\sum_{n=-\infty}^\infty e^{-\sigma^2n^2/2+in(\theta-\mu)} =\frac{1}{2\pi}\vartheta\left(\frac{\theta-\mu}{2\pi},\frac{i\sigma^2}{2\pi}\right) ,

where \vartheta(\theta,\tau) is the Jacobi theta function, given by


\vartheta(\theta,\tau)=\sum_{n=-\infty}^\infty (w^2)^n q^{n^2}, \text{ where } w \equiv e^{i\pi \theta} \text{ and } q \equiv e^{i\pi\tau}.

The wrapped normal distribution may also be expressed in terms of the Jacobi triple product:[2]

f_{WN}(\theta;\mu,\sigma)=\frac{1}{2\pi}\prod_{n=1}^\infty (1-q^n)(1+q^{n-1/2}z)(1+q^{n-1/2}/z),

where z=e^{i(\theta-\mu)} and q=e^{-\sigma^2}.
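The equivalence of the wrapping-sum and the characteristic-function (theta-function) representations can be verified numerically. The following Python sketch (illustrative only; the function names and truncation limits are choices of convenience, not from any standard library) truncates both infinite sums and compares the results:

```python
import numpy as np

def wrapped_normal_pdf(theta, mu, sigma, k_max=10):
    """Wrapped normal density via the direct wrapping sum, truncated at |k| <= k_max."""
    k = np.arange(-k_max, k_max + 1)
    return np.sum(np.exp(-(theta - mu + 2*np.pi*k)**2 / (2*sigma**2))) / (sigma*np.sqrt(2*np.pi))

def wrapped_normal_pdf_theta_series(theta, mu, sigma, n_max=10):
    """Same density via the characteristic-function (Jacobi theta) series."""
    n = np.arange(-n_max, n_max + 1)
    return np.real(np.sum(np.exp(-sigma**2 * n**2 / 2 + 1j*n*(theta - mu)))) / (2*np.pi)

# The two representations agree to machine precision for moderate sigma.
p1 = wrapped_normal_pdf(1.0, mu=0.3, sigma=1.2)
p2 = wrapped_normal_pdf_theta_series(1.0, mu=0.3, sigma=1.2)
assert abs(p1 - p2) < 1e-12
```

Note that for small σ the wrapping sum converges after only a few terms, while for large σ the theta-function series is the faster of the two; this is the usual practical trade-off between the representations.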

Moments

In terms of the circular variable z=e^{i\theta}, the circular moments of the wrapped normal distribution are the characteristic function of the normal distribution evaluated at integer arguments:

\langle z^n\rangle=\int_\Gamma e^{in\theta}\,f_{WN}(\theta;\mu,\sigma)\,d\theta = e^{i n \mu-n^2\sigma^2/2}.

where \Gamma is some interval of length 2\pi. The first moment is then the average value of z, also known as the mean resultant, or mean resultant vector:


\langle z \rangle=e^{i\mu-\sigma^2/2}

The mean angle is


\theta_\mu=\mathrm{Arg}\langle z \rangle = \mu

and the length of the mean resultant is


R=|\langle z \rangle| = e^{-\sigma^2/2}

The circular standard deviation, which is a useful measure of dispersion for the wrapped normal distribution and its close relative, the von Mises distribution, is given by:


s=\sqrt{\ln(1/R^2)} = \sigma
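The moment formulas above can be checked by direct numerical integration. The Python sketch below (an illustrative check, with parameter values chosen arbitrarily) integrates e^{i\theta}f(\theta) over one period using a rectangle rule, which is exponentially accurate for smooth periodic integrands, and compares with the closed forms:

```python
import numpy as np

mu, sigma = 0.5, 0.8

# Sample the wrapped normal density on one period via the truncated wrapping sum.
theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
k = np.arange(-10, 11)[:, None]
f = np.exp(-(theta - mu + 2*np.pi*k)**2 / (2*sigma**2)).sum(axis=0) / (sigma*np.sqrt(2*np.pi))

# First circular moment <z> = ∫ e^{iθ} f(θ) dθ (rectangle rule over a full period).
z_mean = np.sum(np.exp(1j*theta) * f) * (2*np.pi/4096)

assert abs(z_mean - np.exp(1j*mu - sigma**2/2)) < 1e-8          # <z> = e^{iμ-σ²/2}
assert abs(np.angle(z_mean) - mu) < 1e-8                        # mean angle θ_μ = μ
assert abs(np.sqrt(np.log(1/abs(z_mean)**2)) - sigma) < 1e-6    # circular std s = σ
```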

Estimation of parameters

A series of N measurements z_n=e^{i\theta_n} drawn from a wrapped normal distribution may be used to estimate certain parameters of the distribution. The average of the series \overline{z} is defined as

\overline{z}=\frac{1}{N}\sum_{n=1}^N z_n

and its expectation value will be just the first moment:

\langle\overline{z}\rangle=e^{i\mu-\sigma^2/2}. \,

In other words, \overline{z} is an unbiased estimator of the first moment. If we assume that the mean \mu lies in the interval [-\pi,\pi), then \mathrm{Arg}\,\overline{z} will be a (biased) estimator of the mean \mu.

Viewing the z_n as a set of vectors in the complex plane, the \overline{R}^2 statistic is the square of the length of the averaged vector:

\overline{R}^2=\overline{z}\,\overline{z^*}=\left(\frac{1}{N}\sum_{n=1}^N \cos\theta_n\right)^2+\left(\frac{1}{N}\sum_{n=1}^N \sin\theta_n\right)^2 \,

and its expected value is:

\left\langle \overline{R}^2\right\rangle = \frac{1}{N}+\frac{N-1}{N}\,e^{-\sigma^2}\,

In other words, the statistic

R_e^2=\frac{N}{N-1}\left(\overline{R}^2-\frac{1}{N}\right)

will be an unbiased estimator of e^{-\sigma^2}, and \ln(1/R_e^2) will be a (biased) estimator of \sigma^2.
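The estimation procedure above is easy to simulate. The following Python sketch (illustrative; the sample size, seed, and parameter values are arbitrary choices) draws wrapped normal samples, forms the estimators, and checks that they recover μ and σ² within sampling error:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N = 0.7, 0.5, 200_000

# Draw from the wrapped normal: sample an ordinary normal and wrap to [-π, π).
theta = np.mod(rng.normal(mu, sigma, N) + np.pi, 2*np.pi) - np.pi

z_bar = np.mean(np.exp(1j*theta))
mu_hat = np.angle(z_bar)                   # (biased) estimator of μ
R2 = np.abs(z_bar)**2
Re2 = N/(N-1) * (R2 - 1/N)                 # unbiased estimator of e^{-σ²}
sigma2_hat = np.log(1/Re2)                 # (biased) estimator of σ²

assert abs(mu_hat - mu) < 0.01
assert abs(sigma2_hat - sigma**2) < 0.01
```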

Entropy

The information entropy of the wrapped normal distribution is defined as:[1]

H = -\int_\Gamma f_{WN}(\theta;\mu,\sigma)\,\ln(f_{WN}(\theta;\mu,\sigma))\,d\theta

where \Gamma is any interval of length 2\pi. Defining z=e^{i(\theta-\mu)} and q=e^{-\sigma^2}, the Jacobi triple product representation for the wrapped normal is:

f_{WN}(\theta;\mu,\sigma) = \frac{\phi(q)}{2\pi}\prod_{m=1}^\infty (1+q^{m-1/2}z)(1+q^{m-1/2}z^{-1})

where \phi(q) is the Euler function. The logarithm of the density of the wrapped normal distribution may be written:

\ln(f_{WN}(\theta;\mu,\sigma))=  \ln\left(\frac{\phi(q)}{2\pi}\right)+\sum_{m=1}^\infty\ln(1+q^{m-1/2}z)+\sum_{m=1}^\infty\ln(1+q^{m-1/2}z^{-1})

Using the series expansion for the logarithm:

\ln(1+x)=-\sum_{k=1}^\infty \frac{(-1)^k}{k}\,x^k

the logarithmic sums may be written as:

\sum_{m=1}^\infty\ln(1+q^{m-1/2}z^{\pm 1})=-\sum_{m=1}^\infty \sum_{k=1}^\infty \frac{(-1)^k}{k}\,q^{mk-k/2}z^{\pm k} = -\sum_{k=1}^\infty \frac{(-1)^k}{k}\,\frac{q^{k/2}}{1-q^k}\,z^{\pm k}

so that the logarithm of density of the wrapped normal distribution may be written as:

\ln(f_{WN}(\theta;\mu,\sigma))=\ln\left(\frac{\phi(q)}{2\pi}\right)-\sum_{k=1}^\infty \frac{(-1)^k}{k} \frac{q^{k/2}}{1-q^k}\,(z^k+z^{-k})

which is essentially a Fourier series in \theta. Using the characteristic function representation for the wrapped normal distribution on the left side of the integral:

f_{WN}(\theta;\mu,\sigma) =\frac{1}{2\pi}\sum_{n=-\infty}^\infty q^{n^2/2}\,z^n

the entropy may be written:

H = -\ln\left(\frac{\phi(q)}{2\pi}\right)+\frac{1}{2\pi}\int_\Gamma \left( \sum_{n=-\infty}^\infty\sum_{k=1}^\infty \frac{(-1)^k}{k} \frac{q^{(n^2+k)/2}}{1-q^k}\left(z^{n+k}+z^{n-k}\right) \right)\,d\theta

which may be integrated to yield:

H = -\ln\left(\frac{\phi(q)}{2\pi}\right)+2\sum_{k=1}^\infty \frac{(-1)^k}{k}\, \frac{q^{(k^2+k)/2}}{1-q^k}
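The closed-form series for H can be checked against direct numerical integration of -f\ln f. The Python sketch below (an illustrative verification; the truncation limits are convenience choices) computes both sides for σ = 1:

```python
import numpy as np

mu, sigma = 0.0, 1.0
q = np.exp(-sigma**2)

# Direct integration of -f ln f over one period (rectangle rule; exponentially
# accurate for smooth periodic integrands).
theta = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
k = np.arange(-10, 11)[:, None]
f = np.exp(-(theta - mu + 2*np.pi*k)**2 / (2*sigma**2)).sum(axis=0) / (sigma*np.sqrt(2*np.pi))
H_numeric = -np.sum(f * np.log(f)) * (2*np.pi/4096)

# Series formula: H = -ln(φ(q)/2π) + 2 Σ_k [(-1)^k/k] q^{(k²+k)/2}/(1-q^k),
# with the Euler function φ(q) = Π_m (1-q^m).
m = np.arange(1, 200)
log_phi = np.sum(np.log(1 - q**m))
ks = np.arange(1, 200)
series = np.sum((-1.0)**ks / ks * q**((ks**2 + ks)/2) / (1 - q**ks))
H_series = -(log_phi - np.log(2*np.pi)) + 2*series

assert abs(H_numeric - H_series) < 1e-8
```

Both sums converge very quickly here because q^{(k^2+k)/2} decays super-geometrically in k; for small σ (q close to 1) more terms of the Euler product are needed.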


References

  1. Mardia, Kantilal; Jupp, Peter E. (1999). Directional Statistics. Wiley. ISBN 978-0-471-95333-3.
  2. Whittaker, E. T.; Watson, G. N. (2009). A Course of Modern Analysis. Book Jungle. ISBN 978-1-4385-2815-1.
