Heteroscedasticity-consistent standard errors

In statistics, a frequent assumption in linear regression is that the disturbances ui all have the same variance. When this assumption fails, the estimated residuals $$\scriptstyle\widehat{u_i} $$ exhibit heteroskedasticity. Heteroskedasticity-consistent (HC) standard errors are used to deal with this problem by producing consistent estimates of the standard errors even when the error variances differ across observations. The first such estimator was proposed by White (1980), and improved variants have since been developed for cross-sectional data, time-series data and GARCH estimation.

Definition
Consider the linear regression model



$$ y = X \beta + u, $$

where X is the design matrix and $$\scriptstyle\beta $$ is a column vector of parameters to be estimated.

The ordinary least squares (OLS) estimator is



$$ \widehat{\beta} = (X' X)^{-1} X' y. $$
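As a quick illustration, the OLS estimator above can be computed directly from the normal equations. This is a minimal sketch using NumPy on simulated data; the variable names and the simulated design are illustrative assumptions, not part of the original exposition.

```python
import numpy as np

# Simulated data (illustrative): n observations, k parameters including an intercept
rng = np.random.default_rng(0)
n, k = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])  # design matrix
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)

# beta_hat = (X'X)^{-1} X'y, computed with a linear solve rather than an
# explicit inverse for numerical stability
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

With well-behaved errors, `beta_hat` should land close to `beta_true`; the solve-based form agrees with `np.linalg.lstsq` up to floating-point error.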

If the errors all have the same variance $$\scriptstyle\sigma^2 $$ and are uncorrelated, then the least-squares estimate of $$\scriptstyle\beta $$ is BLUE (the best linear unbiased estimator, by the Gauss–Markov theorem). Under heteroskedasticity the errors instead have differing variances $$\scriptstyle\sigma_i^2 $$, and the usual OLS variance estimator is


$$ \widehat{\sigma}^2 = {{\widehat{u}' \widehat{u}} \over {n-k} }, $$

where $$\scriptstyle \widehat{u}\, =\, (I - X (X'X)^{-1} X') y $$ is the vector of residuals. Heteroskedasticity can take many forms, and no single parametric specification covers every possibility.
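The residual vector and the homoskedastic variance estimator above can be sketched as follows (again on simulated data; the setup is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 50, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # design matrix with intercept
y = X @ np.array([0.5, 1.5]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat                # equals (I - X(X'X)^{-1}X') y
sigma2_hat = (u_hat @ u_hat) / (n - k)  # hat{sigma}^2 = u'u / (n - k)
```

A useful sanity check is that OLS residuals are orthogonal to the columns of the design matrix, so `X.T @ u_hat` is numerically zero.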

HC estimators are recommended to deal with this problem.

White's heteroskedasticity-consistent estimator
White's (1980) HC estimator, often referred to as HC0, replaces the unknown error covariance matrix with a diagonal matrix of squared residuals,

$$ \widehat{\Omega} = \operatorname{diag}(\widehat{u}^2_1, \widehat{u}^2_2, \dots, \widehat{u}^2_n), $$

which yields the "sandwich" variance estimator

$$ \widehat{\operatorname{Var}}(\widehat{\beta}) = (X'X)^{-1} X' \widehat{\Omega} X (X'X)^{-1}. $$
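A minimal sketch of White's HC0 estimator, assuming the sandwich form $(X'X)^{-1} X' \operatorname{diag}(\widehat{u}_i^2) X (X'X)^{-1}$; the heteroskedastic data-generating process here is invented for illustration:

```python
import numpy as np

# Simulated heteroskedastic data: error variance grows with |x|
rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
u = rng.normal(size=n) * (0.5 + np.abs(x))
y = X @ np.array([1.0, 2.0]) + u

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u_hat = y - X @ beta_hat

# HC0: bread = (X'X)^{-1}, meat = X' diag(u_hat_i^2) X
XtX_inv = np.linalg.inv(X.T @ X)
meat = (X * u_hat[:, None]**2).T @ X   # avoids forming the n x n diagonal matrix
cov_hc0 = XtX_inv @ meat @ XtX_inv
se_hc0 = np.sqrt(np.diag(cov_hc0))
```

The `(X * u_hat[:, None]**2).T @ X` form computes the same "meat" matrix as `X.T @ np.diag(u_hat**2) @ X` without materializing an n-by-n matrix.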