# Skellam distribution

*Figure: examples of the probability mass function for the Skellam distribution. The horizontal axis is the index k. (The function is defined only at integer values of k; the connecting lines do not indicate continuity.) A companion figure shows the cumulative distribution function.*

| | |
|---|---|
| Parameters | $\mu _{1}\geq 0,~~\mu _{2}\geq 0$ |
| Support | $\{\ldots ,-2,-1,0,1,2,\ldots \}$ |
| PMF | $e^{-(\mu _{1}+\mu _{2})}\left({\frac {\mu _{1}}{\mu _{2}}}\right)^{k/2}I_{k}(2{\sqrt {\mu _{1}\mu _{2}}})$ |
| Mean | $\mu _{1}-\mu _{2}$ |
| Median | N/A |
| Variance | $\mu _{1}+\mu _{2}$ |
| Skewness | ${\frac {\mu _{1}-\mu _{2}}{(\mu _{1}+\mu _{2})^{3/2}}}$ |
| Excess kurtosis | $1/(\mu _{1}+\mu _{2})$ |
| MGF | $e^{-(\mu _{1}+\mu _{2})+\mu _{1}e^{t}+\mu _{2}e^{-t}}$ |
| CF | $e^{-(\mu _{1}+\mu _{2})+\mu _{1}e^{it}+\mu _{2}e^{-it}}$ |

The Skellam distribution is the discrete probability distribution of the difference $K_{1}-K_{2}$ of two correlated or uncorrelated random variables $K_{1}$ and $K_{2}$ having Poisson distributions with respective expected values $\mu _{1}$ and $\mu _{2}$. It is useful in describing the statistics of the difference of two images with simple photon noise, as well as in describing the point-spread distribution in sports where all scored points are equal, such as baseball, hockey and soccer.

Only the case of uncorrelated variables will be considered in this article. See Karlis & Ntzoufras, 2003 for the use of the Skellam distribution to describe the difference of correlated Poisson-distributed variables.
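As a concrete illustration of the sports application, the sketch below uses SciPy's `skellam` implementation to turn two Poisson scoring rates into win/draw/loss probabilities via the distribution of the goal difference. The rates `mu_home` and `mu_away` are invented for the example, not taken from any source:

```python
from scipy.stats import skellam

# Hypothetical illustration: two hockey teams whose goal totals are modeled
# as independent Poisson variables with assumed rates of 3.1 and 2.7 per game.
mu_home, mu_away = 3.1, 2.7

# The goal difference K = K_home - K_away follows a Skellam distribution.
p_home_win = 1 - skellam.cdf(0, mu_home, mu_away)  # P(K >= 1)
p_draw = skellam.pmf(0, mu_home, mu_away)          # P(K = 0)
p_away_win = skellam.cdf(-1, mu_home, mu_away)     # P(K <= -1)

print(f"home win {p_home_win:.3f}, draw {p_draw:.3f}, away win {p_away_win:.3f}")
assert abs(p_home_win + p_draw + p_away_win - 1.0) < 1e-9
```

Since the three outcomes partition the sample space, the probabilities must sum to one, which the final assertion checks.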

Note that the probability mass function of a Poisson distribution with mean μ is given by

$f(k;\mu )={\mu ^{k} \over k!}e^{-\mu }.$

The Skellam probability mass function is the cross-correlation of two Poisson distributions (Skellam, 1946):

$f(k;\mu _{1},\mu _{2})=\sum _{n=-\infty }^{\infty }f(k+n;\mu _{1})\,f(n;\mu _{2})$

$=e^{-(\mu _{1}+\mu _{2})}\sum _{n=-\infty }^{\infty }{{\mu _{1}^{k+n}\mu _{2}^{n}} \over {n!\,(k+n)!}}$

$=e^{-(\mu _{1}+\mu _{2})}\left({\mu _{1} \over \mu _{2}}\right)^{k/2}I_{k}(2{\sqrt {\mu _{1}\mu _{2}}}),$

where $I_{k}(z)$ is the modified Bessel function of the first kind. These formulas assume that any term with a negative factorial is set to zero. The special case $\mu _{1}=\mu _{2}\,(=\mu )$ is given by (Irwin, 1937):

$f\left(k;\mu ,\mu \right)=e^{-2\mu }I_{k}(2\mu ).$

Note also that, using the limiting values of the modified Bessel function for small arguments, the Poisson distribution is recovered as the special case of the Skellam distribution with $\mu _{2}=0$.
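The two forms of the probability mass function can be checked against each other numerically. The sketch below, assuming SciPy is available, implements the (truncated) cross-correlation sum and the Bessel-function closed form, and compares both with `scipy.stats.skellam`:

```python
import math
from scipy.special import iv  # modified Bessel function of the first kind
from scipy.stats import skellam

def skellam_pmf_bessel(k, mu1, mu2):
    """Closed form: e^{-(mu1+mu2)} (mu1/mu2)^{k/2} I_k(2 sqrt(mu1 mu2))."""
    return math.exp(-(mu1 + mu2)) * (mu1 / mu2) ** (k / 2) \
        * iv(k, 2 * math.sqrt(mu1 * mu2))

def skellam_pmf_sum(k, mu1, mu2, terms=60):
    """Cross-correlation of two Poisson pmfs, truncated after `terms` terms."""
    poisson = lambda j, mu: math.exp(-mu) * mu ** j / math.factorial(j)
    # Terms with a negative index (a "negative factorial") are set to zero,
    # so the sum starts at n = max(0, -k).
    return sum(poisson(k + n, mu1) * poisson(n, mu2)
               for n in range(max(0, -k), terms))

mu1, mu2 = 2.0, 1.5
for k in (-3, 0, 4):
    a = skellam_pmf_bessel(k, mu1, mu2)
    b = skellam_pmf_sum(k, mu1, mu2)
    c = skellam.pmf(k, mu1, mu2)
    assert abs(a - b) < 1e-12 and abs(a - c) < 1e-8
```

Note that `iv` accepts negative orders directly; for integer $k$, $I_{-k}(z)=I_{k}(z)$, which is why the closed form covers negative differences as well.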

## Properties

The Skellam probability mass function is of course normalized:

$\sum _{k=-\infty }^{\infty }f(k;\mu _{1},\mu _{2})=1.$

We know that the generating function for a Poisson distribution is:

$G\left(t;\mu \right)=e^{\mu (t-1)}.$

It follows that the generating function $G(t;\mu _{1},\mu _{2})$ for a Skellam probability mass function will be:

$G(t;\mu _{1},\mu _{2})=\sum _{k=-\infty }^{\infty }f(k;\mu _{1},\mu _{2})t^{k}$

$=G\left(t;\mu _{1}\right)G\left(1/t;\mu _{2}\right)$

$=e^{-(\mu _{1}+\mu _{2})+\mu _{1}t+\mu _{2}/t},$

where the sum runs over all integers $k$, since the support includes negative values. Notice that the form of the generating function implies that sums or differences of any number of independent Skellam-distributed variables are again Skellam-distributed.
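This closure property is easy to verify numerically: convolving the pmfs of two independent Skellam variables should reproduce the pmf of a single Skellam distribution whose parameters are the sums of the originals. A minimal sketch, assuming SciPy/NumPy and truncating the support to a finite grid:

```python
import numpy as np
from scipy.stats import skellam

# Grid wide enough that the neglected tail mass is negligible.
K = 60
ks = np.arange(-K, K + 1)

mu1, mu2 = 2.0, 1.0   # X ~ Skellam(2.0, 1.0)
mu3, mu4 = 0.5, 3.0   # Y ~ Skellam(0.5, 3.0), independent of X

px = skellam.pmf(ks, mu1, mu2)
py = skellam.pmf(ks, mu3, mu4)

# pmf of X + Y by discrete convolution; the result's support runs -2K..2K.
pz = np.convolve(px, py)
ks_z = np.arange(-2 * K, 2 * K + 1)

# X + Y should be Skellam(mu1 + mu3, mu2 + mu4).
target = skellam.pmf(ks_z, mu1 + mu3, mu2 + mu4)
assert np.max(np.abs(pz - target)) < 1e-10
```

The same bookkeeping with `py` reversed would verify the difference: $X-Y$ is Skellam with parameters $\mu_1+\mu_4$ and $\mu_2+\mu_3$.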

It is sometimes claimed that any linear combination of two Skellam-distributed variables is again Skellam-distributed, but this is clearly not true, since any multiplier other than $\pm 1$ would change the support of the distribution.

The moment-generating function is given by:

$M\left(t;\mu _{1},\mu _{2}\right)=G(e^{t};\mu _{1},\mu _{2})$

$=\sum _{k=0}^{\infty }{t^{k} \over k!}\,m_{k},$

which yields the raw moments $m_{k}$. Define:

$\Delta \ {\stackrel {\mathrm {def} }{=}}\ \mu _{1}-\mu _{2},$

$\mu \ {\stackrel {\mathrm {def} }{=}}\ (\mu _{1}+\mu _{2})/2.$

Then the raw moments $m_{k}$ are

$m_{1}=\Delta ,$

$m_{2}=2\mu +\Delta ^{2},$

$m_{3}=\Delta (1+6\mu +\Delta ^{2}).$

The central moments $M_{k}$ are

$M_{2}=2\mu ,$

$M_{3}=\Delta ,$

$M_{4}=2\mu +12\mu ^{2}.$

The mean, variance, skewness, and excess kurtosis are respectively:

$\operatorname {E} (n)=\Delta ,$

$\sigma ^{2}=2\mu ,$

$\gamma _{1}=\Delta /(2\mu )^{3/2},$

$\gamma _{2}=1/(2\mu ).$

The cumulant-generating function is given by:

$K(t;\mu _{1},\mu _{2})\ {\stackrel {\mathrm {def} }{=}}\ \ln(M(t;\mu _{1},\mu _{2}))=\sum _{k=0}^{\infty }{t^{k} \over k!}\,\kappa _{k},$

which yields the cumulants:

$\kappa _{2k}=2\mu \quad (k\geq 1),$

$\kappa _{2k+1}=\Delta \quad (k\geq 0).$

For the special case $\mu _{1}=\mu _{2}$, an asymptotic expansion of the modified Bessel function of the first kind yields, for large $\mu$:

$f(k;\mu ,\mu )\sim {1 \over {\sqrt {4\pi \mu }}}\left[1+\sum _{n=1}^{\infty }(-1)^{n}{\{4k^{2}-1^{2}\}\{4k^{2}-3^{2}\}\cdots \{4k^{2}-(2n-1)^{2}\} \over n!\,2^{3n}\,(2\mu )^{n}}\right]$

(Abramowitz & Stegun 1972, p. 377). Also, for this special case, when $k$ is large and of the order of ${\sqrt {2\mu }}$, the distribution tends to a normal distribution:

$f(k;\mu ,\mu )\sim {e^{-k^{2}/4\mu } \over {\sqrt {4\pi \mu }}}.$

These special results extend readily to the more general case of different means.
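A quick numerical check of this normal limit, assuming SciPy and an arbitrarily chosen large $\mu$, compares the exact pmf with the Gaussian approximation for $k$ up to the order of ${\sqrt {2\mu }}$:

```python
import math
from scipy.stats import skellam

mu = 400.0  # large mu, so the asymptotic regime applies; sqrt(2*mu) ~ 28
for k in (0, 10, 25):
    exact = skellam.pmf(k, mu, mu)
    approx = math.exp(-k**2 / (4 * mu)) / math.sqrt(4 * math.pi * mu)
    # Relative error shrinks as mu grows; ~0.1% or better at mu = 400.
    assert abs(exact - approx) / exact < 0.01
```

For smaller $\mu$, or for $k$ well beyond ${\sqrt {2\mu }}$, the approximation degrades and the exact Bessel-function form should be used instead.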