Zeta distribution

From Wikipedia, the free encyclopedia
zeta

Probability mass function
[Plot of the zeta PMF on a log-log scale. The function is defined only at integer values of k; the connecting lines do not indicate continuity.]
Cumulative distribution function
[Plot of the zeta CDF]

Parameters: s\in(1,\infty)
Support: k \in \{1,2,\ldots\}
PMF: \frac{1/k^s}{\zeta(s)}
CDF: \frac{H_{k,s}}{\zeta(s)}
Mean: \frac{\zeta(s-1)}{\zeta(s)}~\text{for}~s>2
Mode: 1
Variance: \frac{\zeta(s)\zeta(s-2) - \zeta(s-1)^2}{\zeta(s)^2}~\text{for}~s>3
Entropy: \sum_{k=1}^\infty\frac{1/k^s}{\zeta(s)}\log (k^s \zeta(s))
MGF: \frac{\operatorname{Li}_s(e^t)}{\zeta(s)}
CF: \frac{\operatorname{Li}_s(e^{it})}{\zeta(s)}

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s, then the probability that X takes the integer value k is given by the probability mass function

f_s(k)=k^{-s}/\zeta(s)\,

where ζ(s) is the Riemann zeta function (which is undefined for s = 1).

The multiplicities of distinct prime factors of X are independent random variables.
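This independence can be checked numerically from the pmf: summing over the multiples of m gives P(m divides X) = m^{-s}, so the joint probability of divisibility by coprime prime powers factorizes. A minimal sketch in Python, using truncated sums (the helper name `prob_divisible` is ours, not standard terminology):

```python
import math

def prob_divisible(m, s, terms=200_000):
    """P(m divides X) for zeta-distributed X, by truncated summation.

    Analytically, summing k**(-s) over the multiples of m gives
    m**(-s) * zeta(s), so P(m | X) = m**(-s) exactly.
    """
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(k ** -s for k in range(m, terms + 1, m)) / z

s = 3.0
# The multiplicity of 2 in X is >= 2 iff 4 | X; that of 3 is >= 2 iff 9 | X.
# Independence: P(4 | X and 9 | X) = P(36 | X) = 36**-s = 4**-s * 9**-s.
joint = prob_divisible(36, s)
product = prob_divisible(4, s) * prob_divisible(9, s)
print(joint, product)
```

The factorization m^{-s} = p^{-as} q^{-bs} for m = p^a q^b is exactly the independence statement for the prime multiplicities.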

Since the Riemann zeta function is the sum of the terms k^{-s} over all positive integers k, it appears here as the normalizing constant of the distribution. Indeed, the terms "Zipf distribution" and "zeta distribution" are often used interchangeably. Note, however, that although the zeta distribution is a probability distribution in its own right, it does not coincide with Zipf's law for the same exponent. See also the Yule–Simon distribution.
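Concretely, the pmf can be evaluated by approximating ζ(s) with a truncated sum. A short sketch (the helper names `zeta_approx` and `zeta_pmf` are ours):

```python
import math

def zeta_approx(s, terms=200_000):
    """Truncated series for the Riemann zeta function; adequate for s well above 1."""
    return sum(k ** -s for k in range(1, terms + 1))

def zeta_pmf(k, s, z=None):
    """f_s(k) = k**(-s) / zeta(s) for k = 1, 2, ..."""
    if z is None:
        z = zeta_approx(s)
    return k ** -s / z

z4 = zeta_approx(4.0)
# For s = 4, zeta(4) = pi**4 / 90, so f_4(1) = 90 / pi**4 ≈ 0.9239.
print(zeta_pmf(1, 4.0, z4))
# The probabilities sum to 1 (up to truncation of the tail).
print(sum(zeta_pmf(k, 4.0, z4) for k in range(1, 1000)))
```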

Moments

The nth raw moment is defined as the expected value of X^n:

m_n = E(X^n) = \frac{1}{\zeta(s)}\sum_{k=1}^\infty \frac{1}{k^{s-n}}

The series on the right is just a series representation of the Riemann zeta function, but it only converges for values of s-n that are greater than unity. Thus:

m_n =\begin{cases}
\zeta(s-n)/\zeta(s) & \text{for}~n < s-1 \\
\infty & \text{for}~n \ge s-1
\end{cases}

Note that the ratio of the zeta functions is well defined, even for n ≥ s − 1 because the series representation of the zeta function can be analytically continued. This does not change the fact that the moments are specified by the series itself, and are therefore undefined for large n.
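For n < s − 1 the formula can be checked numerically; for example, the mean for s = 4 is ζ(3)/ζ(4). A sketch under the same truncated-sum approximation (function names are ours):

```python
import math

def zeta_approx(s, terms=200_000):
    """Truncated Riemann zeta series; adequate for s well above 1."""
    return sum(k ** -s for k in range(1, terms + 1))

def raw_moment(n, s, terms=200_000):
    """E[X**n] for zeta-distributed X; the series converges only when n < s - 1."""
    z = zeta_approx(s, terms)
    return sum(k ** (n - s) for k in range(1, terms + 1)) / z

# Mean for s = 4: should agree with zeta(3)/zeta(4) ≈ 1.1106.
print(raw_moment(1, 4.0))
print(zeta_approx(3.0) / zeta_approx(4.0))
```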

Moment generating function

The moment generating function is defined as

M(t;s) = E(e^{tX}) = \frac{1}{\zeta(s)} \sum_{k=1}^\infty \frac{e^{tk}}{k^s}.

The sum is the series definition of the polylogarithm, valid for e^t<1, so that

M(t;s) = \frac{\operatorname{Li}_s(e^t)}{\zeta(s)}\text{ for }t<0.
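The restriction t < 0 matters in practice: the defining series diverges for t > 0. Moments can still be recovered one-sidedly, e.g. the mean as a left difference quotient at 0. A sketch by truncated summation (the helper name `mgf` is ours):

```python
import math

def mgf(t, s, terms=200_000):
    """M(t; s) = Li_s(e**t) / zeta(s), via truncated sums; requires t <= 0."""
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(math.exp(t * k) / k ** s for k in range(1, terms + 1)) / z

s = 4.0
h = 1e-4
# M(0; s) = 1, and the one-sided difference quotient at 0 approximates
# the mean zeta(3)/zeta(4), up to an O(h) bias.
print(mgf(0.0, s))
print((mgf(0.0, s) - mgf(-h, s)) / h)
```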

A Taylor-series expansion of this function does not necessarily yield the moments of the distribution. The Taylor series built from the moments as they usually appear in a moment generating function yields

\sum_{n=0}^\infty \frac{m_n t^n}{n!},

which obviously is not well defined for any finite value of s since the moments become infinite for large n. If we use the analytically continued terms instead of the moments themselves, we obtain from a series representation of the polylogarithm

\frac{1}{\zeta(s)}\sum_{n=0,n\ne s-1}^\infty \frac{\zeta(s-n)}{n!}\,t^n=\frac{\operatorname{Li}_s(e^t)-\Phi(s,t)}{\zeta(s)}

for |t|<2\pi. Here \Phi(s,t) is given by

\Phi(s,t)=\Gamma(1-s)(-t)^{s-1}\text{ for }s\ne 1,2,3,\ldots
\Phi(s,t)=\frac{t^{s-1}}{(s-1)!}\left[H_{s-1}-\ln(-t)\right]\text{ for }s=2,3,4,\ldots
\Phi(s,t)=-\ln(-t)\text{ for }s=1,\,

where H_{s-1} is a harmonic number. (Note that the index is s − 1, consistent with the s = 1 case, where H_0 = 0 leaves only the logarithmic term.)

The case s = 1

ζ(1) is infinite, as the harmonic series diverges, and so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if

\lim_{n\rightarrow\infty}\frac{N(A,n)}{n}

exists, where N(A, n) is the number of members of A less than or equal to n, then

\lim_{s\rightarrow 1+}P(X\in A)\,

is equal to that density.

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first decimal digit is d, then A has no density, but the limit above nonetheless exists and is proportional to

\log(d+1) - \log(d),\,

similar to Benford's law.
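Two concrete instances of the density statement (our notation): for A the multiples of m, P(X ∈ A) = Σ_j (mj)^{-s}/ζ(s) = m^{-s}, which tends to the density 1/m as s → 1+; for A the perfect squares, P(X ∈ A) = ζ(2s)/ζ(s), which tends to 0, the density of the squares. A numerical sketch for s not too close to 1 (truncated sums converge too slowly near s = 1; helper names are ours):

```python
import math

def zeta_approx(s, terms=200_000):
    """Truncated Riemann zeta series; adequate for s well above 1."""
    return sum(k ** -s for k in range(1, terms + 1))

def prob_multiple(m, s, terms=200_000):
    """P(X is a multiple of m); analytically equal to m**(-s)."""
    num = sum(k ** -s for k in range(m, terms + 1, m))
    return num / zeta_approx(s, terms)

for s in (3.0, 2.0, 1.5):
    # As s decreases toward 1, P(3 | X) rises toward 1/3 (the density of
    # the multiples of 3) and P(X is a square) = zeta(2s)/zeta(s) falls toward 0.
    print(s, prob_multiple(3, s), zeta_approx(2 * s) / zeta_approx(s))
```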

See also

Other "power-law" distributions
