Copula (statistics)

In statistics, a copula is a multivariate joint distribution defined on the n-dimensional unit cube [0, 1]^n such that every marginal distribution is uniform on the interval [0, 1].

Specifically, $$C:[0,1]^n\to [0,1]$$ is an n-dimensional copula (briefly, n-copula) if:


 * $$ C(\mathbf u)=0$$ whenever $$\mathbf u\in [0,1]^n$$ has at least one component equal to $$0;$$


 * $$ C(\mathbf u)=u_i$$ whenever $$\mathbf u\in [0,1]^n$$ has all the components equal to $$1$$ except the i-th one, which is equal to $$u_i;$$


 * $$ C(\mathbf u) $$ is n-increasing, i.e., for each box $$B=\times_{i=1}^{n}[a_i,b_i]\subseteq [0,1]^n$$ the C-volume of $$B$$ is non-negative:


 * $$ V_{C}\left( B\right):=\sum_{\mathbf z\in \times_{i=1}^{n}\{a_i,b_i\}} (-1)^{N(\mathbf z)} C(\mathbf z)\ge 0,$$

where $$N(\mathbf z)=\operatorname{card}\{k\mid z_k=a_k\}$$ counts the components of $$\mathbf z$$ equal to the lower endpoint $$a_k$$.
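These defining conditions can be checked numerically. The sketch below (function names are illustrative) computes the C-volume V_C(B) for the independence copula C(u) = u_1 ⋯ u_n, which satisfies all three conditions:

```python
from itertools import product

def product_copula(u):
    """Independence copula: C(u_1, ..., u_n) = u_1 * ... * u_n."""
    p = 1.0
    for ui in u:
        p *= ui
    return p

def c_volume(C, box):
    """C-volume V_C(B) of the box B = [a_1, b_1] x ... x [a_n, b_n].

    Sums (-1)^N(z) * C(z) over all 2^n vertices z of the box, where
    N(z) counts the coordinates of z equal to the lower endpoint a_k.
    """
    total = 0.0
    for z in product(*[(a, b) for a, b in box]):
        n_lower = sum(1 for (a, _), zk in zip(box, z) if zk == a)
        total += (-1) ** n_lower * C(z)
    return total

# For the independence copula, V_C(B) is just the box's volume:
v = c_volume(product_copula, [(0.2, 0.7), (0.1, 0.9)])  # 0.5 * 0.8 = 0.4
```

For the independence copula the C-volume of a box equals its ordinary volume, so the non-negativity of `c_volume` here is immediate; the same routine works for any candidate copula.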

Sklar's theorem
The theorem proposed by Sklar underlies most applications of the copula. Sklar's theorem states that given a joint distribution function H for n variables, with respective marginal distribution functions, there exists a copula C that binds the marginals together to give the joint distribution.

For the bivariate case, Sklar's theorem can be stated as follows. For any bivariate distribution function H(x, y), let F(x) = H(x, +∞) and G(y) = H(+∞, y) be the univariate marginal probability distribution functions. Then there exists a copula C such that


 * $$H(x,y)=C(F(x),G(y))\,$$

(where we have identified the distribution C with its cumulative distribution function). Moreover, if the marginal distributions F(x) and G(y) are continuous, the copula C is unique. Otherwise, C is uniquely determined only on the range of values of the marginal distributions.
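In code, Sklar's construction is just function composition. The sketch below (marginals and copula chosen arbitrarily for illustration) couples a standard normal marginal and a unit-rate exponential marginal through the independence copula:

```python
import math

def F(x):
    """Marginal 1: standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def G(y):
    """Marginal 2: exponential(1) CDF."""
    return 1.0 - math.exp(-y) if y > 0 else 0.0

def C(u, v):
    """The copula doing the binding; here the independence copula."""
    return u * v

def H(x, y):
    """Joint CDF via Sklar's theorem: H(x, y) = C(F(x), G(y))."""
    return C(F(x), G(y))

# F(0) = 0.5 and G(ln 2) = 0.5, so H(0, ln 2) = 0.25 under independence
```

Swapping `C` for any other copula changes the dependence structure of the joint distribution while leaving both marginals untouched, which is precisely the point of the theorem.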

Fréchet-Hoeffding copula boundaries


Minimum copula: This is the lower bound for all copulas. In the bivariate case only, it represents perfect negative dependence between variates; for n > 2 the bound W is not itself a copula.


 * $$ W(u,v) = \max(0,u+v-1).\,$$

For $$n$$-variate copulas, the lower bound is given by
 * $$ W(u_1,\ldots,u_n) := \max\left\{1-n+\sum\limits_{i=1}^n {u_i}, 0 \right\} \leq C(u_1,\ldots,u_n) $$

Maximum copula: This is the upper bound for all copulas. It represents perfect positive dependence between variates:


 * $$ M(u,v) = \min(u,v).\,$$

For $$n$$-variate copulas, the upper bound is given by
 * $$C(u_1,\ldots,u_n)\le \min_{j \in \{1,\ldots,n\}} u_j =: M(u_1,\ldots,u_n)$$

Conclusion: For all copulas $$C(u,v)$$,
 * $$ W(u,v) \le C(u,v) \le M(u,v)$$

In the multivariate case, the corresponding inequality is
 * $$ W(u_1,\ldots,u_n) \le C(u_1,\ldots,u_n) \le M(u_1,\ldots,u_n)$$

Gaussian copula


One example of a copula often used for modelling in finance is the Gaussian copula, which is constructed from the bivariate normal distribution via Sklar's theorem. For X and Y distributed as standard bivariate normal with correlation ρ, the Gaussian copula function is


 * $$ C_\rho(u,v) = \Phi_{\rho} \left(\Phi^{-1}(u), \Phi^{-1}(v) \right) $$

where Φ denotes the standard normal cumulative distribution function and Φ_ρ the joint cumulative distribution function of the standard bivariate normal with correlation ρ. Differentiating this yields the copula density


 * $$ c_\rho(u,v) = \frac{\phi_{\rho} (\Phi^{-1}(u), \Phi^{-1}(v) )}{\phi(\Phi^{-1}(u))\, \phi(\Phi^{-1}(v))}$$

where


 * $$ \phi_{\rho}(x,y) = \frac{1}{2 \pi\sqrt{1-\rho^2}} \exp \left (-\frac{x^2+y^2-2\rho xy}{2(1-\rho^2)} \right ) $$

is the density of the standard bivariate normal distribution with Pearson product-moment correlation coefficient ρ, and φ is the density of the N(0,1) distribution (the marginal density).
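The construction can be sketched with SciPy (assuming `scipy` is available; the function names here are illustrative). The first function evaluates C_ρ(u, v); the second draws dependent uniforms by pushing correlated normals through Φ, which is the standard way to sample from a Gaussian copula:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula_cdf(u, v, rho):
    """C_rho(u, v) = Phi_rho(Phi^{-1}(u), Phi^{-1}(v))."""
    mvn = multivariate_normal(mean=[0.0, 0.0],
                              cov=[[1.0, rho], [rho, 1.0]])
    return mvn.cdf([norm.ppf(u), norm.ppf(v)])

def sample_gaussian_copula(rho, size, seed=0):
    """Draw (U, V) pairs whose dependence is the Gaussian copula."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal([0.0, 0.0],
                                [[1.0, rho], [rho, 1.0]], size=size)
    return norm.cdf(z)  # each column is Uniform(0, 1)
```

The sampled pairs have uniform marginals by construction, so any pair of target marginals can be imposed afterwards by applying their inverse CDFs columnwise.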

Archimedean copulas
One particularly simple form of a copula is


 * $$ H(x,y) = \Psi^{-1}(\Psi(F(x))+\Psi(G(y)))\,$$

where $$\Psi$$ is known as a generator function. Such copulas are known as Archimedean. Any generator function that satisfies the properties below is the basis for a valid copula:


 * $$\Psi(1) = 0;\qquad \lim_{x \to 0}\Psi(x) = \infty;\qquad \Psi'(x) < 0;\qquad \Psi''(x) > 0. $$

Product copula: Also called the independent copula, this copula has no dependence between variates. Its density function is unity everywhere.


 * $$\Psi(x) = -\ln(x); \qquad H(x,y) = F(x)G(y).$$

Where the generator function is indexed by a parameter, a whole family of copulas may be Archimedean. For example:

Clayton copula:


 * $$\Psi(x) = x^{\theta} -1;\qquad \theta < 0; \qquad H(x,y) = (F(x)^\theta+G(y)^\theta-1)^{1/\theta}.$$

In the limit θ → 0, the Clayton copula converges to the independence copula, and the random variables become statistically independent. The generator-function approach can be extended to create multivariate copulas simply by including more additive terms inside $$\Psi^{-1}$$.
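With the convention used above (θ < 0), the Clayton construction can be sketched directly from its generator (a minimal illustration; the function name is not a standard library call):

```python
def clayton_copula(u, v, theta):
    """Clayton copula built from its Archimedean generator.

    Generator Psi(t) = t**theta - 1 with theta < 0 (the convention
    used above); its inverse is Psi^{-1}(s) = (s + 1)**(1/theta).
    """
    psi = lambda t: t ** theta - 1.0
    psi_inv = lambda s: (s + 1.0) ** (1.0 / theta)
    return psi_inv(psi(u) + psi(v))

# theta = -1 gives C(u, v) = u*v / (u + v - u*v); theta -> 0 recovers u*v
```

Because the generator enters only additively, a trivariate version is `psi_inv(psi(u) + psi(v) + psi(w))`, which is the multivariate extension mentioned above.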

General

 * David G. Clayton (1978), "A model for association in bivariate life tables and its application in epidemiological studies of familial tendency in chronic disease incidence", Biometrika 65, 141–151.
 * E.W. Frees, E.A. Valdez (1998), "Understanding Relationships Using Copulas", North American Actuarial Journal 2, 1–25.
 * Roger B. Nelsen (1999), An Introduction to Copulas. ISBN 0-387-98623-5.
 * S. Rachev, C. Menn, F. Fabozzi (2005), Fat-Tailed and Skewed Asset Return Distributions. ISBN 0-471-71886-6.
 * A. Sklar (1959), "Fonctions de répartition à n dimensions et leurs marges", Publications de l'Institut de Statistique de l'Université de Paris 8, 229–231.
 * W.T. Shaw, K.T.A. Lee (2006), "Copula Methods vs Canonical Multivariate Distributions: The Multivariate Student T Distribution with General Degrees of Freedom".