# Zeta distribution

(Plot of the zeta PMF on a log-log scale: the function is defined only at integer values of k; connecting lines do not indicate continuity.)

| Quantity | Value |
|---|---|
| Parameters | ${\displaystyle s\in (1,\infty )}$ |
| Support | ${\displaystyle k\in \{1,2,\ldots \}}$ |
| PMF | ${\displaystyle {\frac {1/k^{s}}{\zeta (s)}}}$ |
| CDF | ${\displaystyle {\frac {H_{k,s}}{\zeta (s)}}}$ |
| Mean | ${\displaystyle {\frac {\zeta (s-1)}{\zeta (s)}}~{\textrm {for}}~s>2}$ |
| Mode | ${\displaystyle 1\,}$ |
| Variance | ${\displaystyle {\frac {\zeta (s)\zeta (s-2)-\zeta (s-1)^{2}}{\zeta (s)^{2}}}~{\textrm {for}}~s>3}$ |
| Entropy | ${\displaystyle \sum _{k=1}^{\infty }{\frac {1/k^{s}}{\zeta (s)}}\log(k^{s}\zeta (s))}$ |
| MGF | ${\displaystyle {\frac {\operatorname {Li} _{s}(e^{t})}{\zeta (s)}}}$ |
| CF | ${\displaystyle {\frac {\operatorname {Li} _{s}(e^{it})}{\zeta (s)}}}$ |

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s, then the probability that X takes the integer value k is given by the probability mass function

${\displaystyle f_{s}(k)=k^{-s}/\zeta (s)\,}$

where ζ(s) is the Riemann zeta function (which is undefined for s = 1).
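As a quick numerical illustration (a minimal sketch using a truncated series for ζ(s); the helper names are ours, not from any particular library), the PMF can be evaluated directly from its definition:

```python
# Truncated-series approximation of the Riemann zeta function (valid for s > 1).
def zeta(s: float, terms: int = 100_000) -> float:
    return sum(k ** -s for k in range(1, terms + 1))

# P(X = k) = k^{-s} / zeta(s) for a zeta-distributed X with parameter s.
s = 3.0
z = zeta(s)

def pmf(k: int) -> float:
    return k ** -s / z

# The probabilities over k = 1, 2, ... sum to 1 (up to truncation error).
total = sum(pmf(k) for k in range(1, 1000))
```

For s = 3, for example, pmf(1) = 1/ζ(3) ≈ 0.832, so most of the mass sits on the smallest integers.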

The multiplicities of distinct prime factors of X are independent random variables.

Since the Riemann zeta function is the sum of the terms ${\displaystyle k^{-s}}$ over all positive integers k, it thus appears as the normalization of the Zipf distribution. Indeed, the terms "Zipf distribution" and "zeta distribution" are often used interchangeably. Note, however, that while the zeta distribution is a probability distribution in its own right, it is not associated with Zipf's law with the same exponent. See also the Yule–Simon distribution.

## Moments

The nth raw moment is defined as the expected value of ${\displaystyle X^{n}}$:

${\displaystyle m_{n}=E(X^{n})={\frac {1}{\zeta (s)}}\sum _{k=1}^{\infty }{\frac {1}{k^{s-n}}}}$

The series on the right is just a series representation of the Riemann zeta function, but it only converges for values of s-n that are greater than unity. Thus:

${\displaystyle m_{n}={\begin{cases}\zeta (s-n)/\zeta (s)&{\textrm {for}}~n<s-1\\\infty &{\textrm {for}}~n\geq s-1.\end{cases}}}$

Note that the ratio of the zeta functions is well defined even for n ≥ s − 1, because the series representation of the zeta function can be analytically continued. This does not change the fact that the moments are specified by the series itself, and they are therefore undefined for n ≥ s − 1.
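A short numeric check of this formula (again a sketch with a truncated ζ series; the function names are illustrative):

```python
def zeta(s: float, terms: int = 100_000) -> float:
    # Truncated series for the Riemann zeta function, s > 1.
    return sum(k ** -s for k in range(1, terms + 1))

def raw_moment(n: int, s: float) -> float:
    # m_n = zeta(s - n) / zeta(s) when n < s - 1; the defining series
    # diverges otherwise, so the moment is infinite.
    if n >= s - 1:
        return float("inf")
    return zeta(s - n) / zeta(s)

# For s = 4 the mean is zeta(3)/zeta(4) ~ 1.1107, while m_3 is infinite.
m1 = raw_moment(1, 4.0)
m3 = raw_moment(3, 4.0)
```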

### Moment generating function

The moment generating function is defined as

${\displaystyle M(t;s)=E(e^{tX})={\frac {1}{\zeta (s)}}\sum _{k=1}^{\infty }{\frac {e^{tk}}{k^{s}}}.}$

The series is just the definition of the polylogarithm, valid for ${\displaystyle e^{t}<1}$ so that

${\displaystyle M(t;s)={\frac {\operatorname {Li} _{s}(e^{t})}{\zeta (s)}}{\text{ for }}t<0.}$
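Numerically, the series for M(t; s) is easy to evaluate for t ≤ 0, and a one-sided difference quotient at t = 0 recovers the mean ζ(s − 1)/ζ(s) (a truncated-series sketch; the names are ours):

```python
from math import exp

def mgf(t: float, s: float, terms: int = 100_000) -> float:
    # M(t; s) = Li_s(e^t) / zeta(s), evaluated from the defining series (t <= 0).
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(exp(t * k) / k ** s for k in range(1, terms + 1)) / z

# M(0; s) = 1, and the left difference quotient at t = 0 approximates the
# mean zeta(s - 1)/zeta(s); for s = 4 this is about 1.1107.
h = 1e-6
mean_est = (mgf(0.0, 4.0) - mgf(-h, 4.0)) / h
```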

The Taylor series expansion of this function does not necessarily yield the moments of the distribution. The formal Taylor series built from the moments, as they usually appear in a moment generating function, is

${\displaystyle \sum _{n=0}^{\infty }{\frac {m_{n}t^{n}}{n!}},}$

which is not well defined for any finite value of s, since the moments become infinite for n ≥ s − 1. If we use the analytically continued terms instead of the moments themselves, we obtain from a series representation of the polylogarithm

${\displaystyle {\frac {1}{\zeta (s)}}\sum _{n=0,n\neq s-1}^{\infty }{\frac {\zeta (s-n)}{n!}}\,t^{n}={\frac {\operatorname {Li} _{s}(e^{t})-\Phi (s,t)}{\zeta (s)}}}$

for ${\displaystyle \scriptstyle |t|\,<\,2\pi }$. ${\displaystyle \scriptstyle \Phi (s,t)}$ is given by

${\displaystyle \Phi (s,t)=\Gamma (1-s)(-t)^{s-1}{\text{ for }}s\neq 1,2,3,\ldots }$
${\displaystyle \Phi (s,t)={\frac {t^{s-1}}{(s-1)!}}\left[H_{s-1}-\ln(-t)\right]{\text{ for }}s=2,3,4,\ldots }$
${\displaystyle \Phi (s,t)=-\ln(-t){\text{ for }}s=1,\,}$

where ${\displaystyle H_{s-1}}$ is a harmonic number.

## The case s = 1

ζ(1) is infinite, since the series at s = 1 is the harmonic series, so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if

${\displaystyle \lim _{n\rightarrow \infty }{\frac {N(A,n)}{n}}}$

exists, where N(A, n) is the number of members of A less than or equal to n, then

${\displaystyle \lim _{s\rightarrow 1+}P(X\in A)\,}$

is equal to that density.

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first digit is d, then A has no density, but nonetheless the second limit given above exists and is proportional to

${\displaystyle \log(d+1)-\log(d)=\log \left(1+{\frac {1}{d}}\right),\,}$

which is Benford's law.
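Normalizing these leading-digit weights over d = 1, …, 9 (the constant of proportionality is 1/ln 10, since the nine terms ln(1 + 1/d) telescope to ln 10) gives the familiar base-10 form of Benford's law, sketched below:

```python
from math import log10

# Benford first-digit probabilities: P(first digit = d) = log10(1 + 1/d).
benford = [log10(1 + 1 / d) for d in range(1, 10)]

# The nine probabilities sum to 1, and the digit 1 leads about 30.1% of the time.
```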

## Infinite divisibility

The zeta distribution can be constructed from a sequence of independent random variables with geometric distributions. Let ${\displaystyle p}$ be a prime number and ${\displaystyle X(p^{-s})}$ be a random variable with a geometric distribution of parameter ${\displaystyle p^{-s}}$, namely

${\displaystyle \quad \quad \quad \mathbb {P} \left(X(p^{-s})=k\right)=p^{-ks}(1-p^{-s}),\quad k=0,1,2,\ldots }$

If the random variables ${\displaystyle (X(p^{-s}))_{p\in {\mathcal {P}}}}$ are independent, where ${\displaystyle {\mathcal {P}}}$ denotes the set of prime numbers, then the random variable ${\displaystyle Z_{s}}$ defined by

${\displaystyle \quad \quad \quad Z_{s}=\prod _{p\in {\mathcal {P}}}p^{X(p^{-s})}}$

has the zeta distribution: ${\displaystyle \mathbb {P} \left(Z_{s}=n\right)={\frac {1}{n^{s}\zeta (s)}}}$.

Stated differently, the random variable ${\displaystyle \log(Z_{s})=\sum _{p\in {\mathcal {P}}}X(p^{-s})\,\log(p)}$ is infinitely divisible with Lévy measure given by the following sum of Dirac masses:

${\displaystyle \quad \quad \quad \Pi _{s}(dx)=\sum _{p\in {\mathcal {P}}}\sum _{k\geqslant 1}{\frac {p^{-ks}}{k}}\delta _{k\log(p)}(dx)}$
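The construction above can be checked numerically: by independence, P(Z_s = n) factors over primes into geometric probabilities, and the Euler product ∏(1 − p^{−s}) reproduces 1/ζ(s). A self-contained sketch (truncated prime and zeta sums; the helper names are illustrative):

```python
def primes_up_to(limit: int) -> list[int]:
    # Sieve of Eratosthenes.
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            for m in range(p * p, limit + 1, p):
                sieve[m] = False
    return [p for p in range(2, limit + 1) if sieve[p]]

def prime_exponent(n: int, p: int) -> int:
    # Multiplicity of the prime p in n.
    e = 0
    while n % p == 0:
        n //= p
        e += 1
    return e

s = 3.0
primes = primes_up_to(10_000)

# P(Z_s = n) = prod_p P(X(p^{-s}) = v_p(n)), each factor geometric: p^{-ks}(1 - p^{-s}).
n = 12  # 12 = 2^2 * 3
prob = 1.0
for p in primes:
    k = prime_exponent(n, p)
    prob *= p ** (-k * s) * (1 - p ** -s)

# The product collapses to n^{-s} * prod_p (1 - p^{-s}) = n^{-s} / zeta(s).
zeta_s = sum(k ** -s for k in range(1, 100_000))
```

Truncating the prime product at 10 000 leaves an error far below the comparison tolerance, so `prob` agrees with `12 ** -s / zeta_s` to many digits.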