Zeta distribution

In probability theory and statistics, the zeta distribution is a discrete probability distribution. If X is a zeta-distributed random variable with parameter s > 1, then the probability that X takes the integer value k is given by the probability mass function


 * $$f_s(k)=k^{-s}/\zeta(s)\,$$

where ζ(s) is the Riemann zeta function (which is undefined for s = 1).

The multiplicities of distinct prime factors of X are independent random variables.
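
This independence can be seen numerically: for a zeta-distributed X, P(m divides X) = m^{-s}, so for coprime divisors such as 2 and 3 the joint probability P(6 | X) factors as P(2 | X) · P(3 | X). A small sketch, summing the pmf over multiples (truncation length is illustrative):

```python
def prob_divisible(m, s, terms=200_000):
    # P(m divides X): sum the zeta pmf k^{-s}/zeta(s) over multiples of m.
    # In the infinite sum this equals m^{-s} exactly.
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(k ** -s for k in range(m, terms + 1, m)) / z
```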

The zeta distribution is equivalent to the Zipf distribution in the limit of infinite N. Indeed, the terms "Zipf distribution" and "zeta distribution" are often used interchangeably.

Moments
The nth raw moment is defined as the expected value of $$X^n$$:


 * $$m_n = E(X^n) = \frac{1}{\zeta(s)}\sum_{k=1}^\infty \frac{1}{k^{s-n}}$$

The series on the right is just a series representation of the Riemann zeta function, but it converges only when s − n is greater than unity. Thus:


 * $$m_n =\left\{\begin{matrix} \zeta(s-n)/\zeta(s) & \textrm{for}~n < s-1 \\ \infty & \textrm{for}~n \ge s-1 \end{matrix}\right.$$

Note that the ratio of the zeta functions is well defined even for n ≥ s − 1, because the series representation of the zeta function can be analytically continued. This does not change the fact that the moments are specified by the series itself, and are therefore undefined for large n.

Moment generating function
The moment generating function is defined as


 * $$M(t;s) = E(e^{tX}) = \frac{1}{\zeta(s)} \sum_{k=1}^\infty \frac{e^{tk}}{k^s}.$$

The series is just the definition of the polylogarithm, valid for $$e^t<1$$, so that


 * $$M(t;s) = \frac{\operatorname{Li}_s(e^t)}{\zeta(s)}\text{ for }t<0.$$
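
For t < 0 the series converges quickly, so M(t; s) can be evaluated directly from its definition. A minimal numerical sketch (the function name and truncation length are illustrative):

```python
import math

def zeta_mgf(t, s, terms=200_000):
    # M(t; s) = E[e^{tX}] = (1/zeta(s)) * sum_{k>=1} e^{tk} / k^s.
    # The series converges for t <= 0 and diverges for t > 0.
    if t > 0:
        raise ValueError("the zeta-distribution MGF diverges for t > 0")
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(math.exp(t * k) * k ** -s for k in range(1, terms + 1)) / z
```

For large negative t the k = 1 term dominates, so M(t; 2) ≈ e^t / ζ(2).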

The Taylor series expansion of this function will not necessarily yield the moments of the distribution. The formal Taylor series built from the moments, as they usually occur in a moment generating function, is


 * $$\sum_{n=0}^\infty \frac{m_n t^n}{n!},$$

which is not well defined for any finite value of s, since the moments become infinite for large n. If we use the analytically continued terms instead of the moments themselves, we obtain from a series representation of the polylogarithm


 * $$\frac{1}{\zeta(s)}\sum_{n=0,n\ne s-1}^\infty \frac{\zeta(s-n)}{n!}\,t^n=\frac{\operatorname{Li}_s(e^t)-\Phi(s,t)}{\zeta(s)}$$

for $$|t|<2\pi$$, where $$\Phi(s,t)$$ is given by


 * $$\Phi(s,t)=\Gamma(1-s)(-t)^{s-1}\text{ for }s\ne 1,2,3\ldots$$
 * $$\Phi(s,t)=\frac{t^{s-1}}{(s-1)!}\left[H_s-\ln(-t)\right]\text{ for }s=2,3,4\ldots$$
 * $$\Phi(s,t)=-\ln(-t)\text{ for }s=1,\,$$

where $$H_s$$ is a harmonic number.

The case s = 1
ζ(1) is infinite (it is the harmonic series), and so the case s = 1 is not meaningful. However, if A is any set of positive integers that has a density, i.e. if


 * $$\lim_{n\rightarrow\infty}\frac{N(A,n)}{n}$$

exists, where N(A, n) is the number of members of A less than or equal to n, then


 * $$\lim_{s\rightarrow 1+}P(X\in A)\,$$

is equal to that density.
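
As an illustration, take A to be the even integers, which have density 1/2. Then P(X ∈ A) = 2^{−s} exactly, and 2^{−s} → 1/2 as s → 1+. A small numerical sketch using truncated sums (helper name and truncation length are illustrative; the truncation is rough for s near 1, so only the trend is visible):

```python
def prob_in_evens(s, terms=400_000):
    # P(X is even) for zeta-distributed X, by summing the pmf over even k.
    # In the infinite sum this equals 2^{-s}, which tends to the density 1/2.
    z = sum(k ** -s for k in range(1, terms + 1))
    return sum(k ** -s for k in range(2, terms + 1, 2)) / z
```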

The latter limit can also exist in some cases in which A does not have a density. For example, if A is the set of all positive integers whose first digit is d, then A has no density, but nonetheless the second limit given above exists and is proportional to


 * $$\log(d+1) - \log(d),\,$$

similar to Benford's law.