Maxwell's theorem

In probability theory, Maxwell's theorem, named in honor of James Clerk Maxwell, states that if the probability distribution of a vector-valued random variable X = ( X1, ..., Xn )T is the same as the distribution of GX for every n×n orthogonal matrix G, and the components X1, ..., Xn are independent, then the components are normally distributed with expected value 0 and all have the same variance. This theorem is one of many characterizations of the normal distribution.

Since multiplication by an orthogonal matrix is a rotation (possibly combined with a reflection), the theorem says that if the probability distribution of a random vector with independent components is unchanged by rotations, then those components are identically distributed and normally distributed. In other words, the only rotationally invariant probability distributions on Rn that have independent components are the multivariate normal distributions with expected value 0 and covariance matrix σ2In (where In is the n×n identity matrix), for some positive number σ2.
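The converse direction of this invariance can be checked numerically: if the components of X are i.i.d. N(0, σ2), then GX has the same distribution as X for any orthogonal G. Below is a minimal NumPy sketch of this check (the sample size, σ, and dimension are illustrative choices, not part of the theorem); it compares the sample covariance of X and of GX, both of which should approximate σ2In.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, m = 3, 2.0, 200_000  # dimension, std. deviation, sample count (illustrative)

# m samples of X, each with independent N(0, sigma^2) components.
X = rng.normal(0.0, sigma, size=(m, n))

# A random orthogonal matrix G, obtained from the QR decomposition
# of a matrix with independent standard normal entries.
G, _ = np.linalg.qr(rng.normal(size=(n, n)))

# Apply G to every sample: each row of Y is G times the corresponding row of X.
Y = X @ G.T

# Both sample covariance matrices should be close to sigma^2 * I_n,
# reflecting the rotational invariance of the distribution of X.
cov_X = np.cov(X, rowvar=False)
cov_Y = np.cov(Y, rowvar=False)
print(np.allclose(cov_X, sigma**2 * np.eye(n), atol=0.05))
print(np.allclose(cov_Y, sigma**2 * np.eye(n), atol=0.05))
```

Matching covariance matrices alone do not prove equality of distributions, but for multivariate normals with mean 0 the covariance determines the distribution completely, so this check is meaningful in the Gaussian setting.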