Bernoulli process

In probability and statistics, a Bernoulli process is a discrete-time stochastic process consisting of a sequence of independent, identically distributed random variables, each taking one of two values. Prosaically, a Bernoulli process is repeated coin flipping, possibly with an unfair coin. A variable in such a sequence may be called a Bernoulli variable.

Definition
A Bernoulli process is a discrete-time stochastic process consisting of a finite or infinite sequence of independent random variables X1, X2, X3,..., such that


 * For each i, the value of Xi is either 0 or 1;
 * For all values of i, the probability that Xi = 1 is the same number p.

In other words, a Bernoulli process is a sequence of independent identically distributed Bernoulli trials. The two possible values of each Xi are often called "success" and "failure", so that, when expressed as a number, 0 or 1, the value is said to be the number of successes on the ith "trial". The individual success/failure variables Xi are also called Bernoulli trials.
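A sequence of Bernoulli trials is easy to simulate. The following minimal sketch (the function name and parameters are illustrative, not standard) draws n independent trials, each succeeding with probability p:

```python
import random

def bernoulli_process(p, n, seed=None):
    """Generate n independent Bernoulli trials with success probability p."""
    rng = random.Random(seed)
    # rng.random() is uniform on [0, 1), so it falls below p with probability p.
    return [1 if rng.random() < p else 0 for _ in range(n)]

trials = bernoulli_process(p=0.3, n=10, seed=42)
print(trials)  # a list of ten 0/1 values
```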

Independence of the Bernoulli trials implies the memorylessness property: past trials provide no information about future outcomes. From any given time, the future trials also form a Bernoulli process, independent of the past (the fresh-start property).

Random variables associated with the Bernoulli process include


 * The number of successes in the first n trials, which has a binomial distribution;
 * The number of trials needed to get r successes, which has a negative binomial distribution;
 * The number of trials needed to get one success, which has a geometric distribution, a special case of the negative binomial distribution.
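The geometric case can be checked empirically: the expected number of trials until the first success is 1/p. A minimal simulation sketch (the function name and sample size are illustrative):

```python
import random

def trials_until_success(p, rng):
    """Count Bernoulli(p) trials until the first success (geometric distribution)."""
    count = 1
    while rng.random() >= p:  # keep flipping until a success occurs
        count += 1
    return count

rng = random.Random(0)
p = 0.25
samples = [trials_until_success(p, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 1/p = 4
```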

The problem of determining the process, given only a limited sample of Bernoulli trials, is known as the problem of checking if a coin is fair.

Formal definition
The Bernoulli process can be formalized in the language of probability spaces. A Bernoulli process is then a probability space $$(\Omega, Pr)$$ together with a sequence of random variables $$X_i$$ taking values in the set $$\{0,1\}$$, such that for every $$\omega \in\Omega$$, one has $$X_i(\omega)=1$$ with probability p and $$X_i(\omega)=0$$ with probability 1-p.

Bernoulli sequence
Given a Bernoulli process defined on a probability space $$(\Omega, Pr)$$, then associated with every $$\omega \in \Omega$$ is a sequence of integers


 * $$\mathbb{Z}^\omega = \{n\in \mathbb{Z} : X_n(\omega) = 1 \}$$

which is called the Bernoulli sequence. So, for example, if $$\omega$$ represents a sequence of coin flips, then the Bernoulli sequence is the list of integers for which the coin toss came out heads.
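As a concrete illustration, extracting the Bernoulli sequence from a finite run of coin flips amounts to collecting the indices of the successes (a sketch; the function name is not standard):

```python
def bernoulli_sequence(flips):
    """Return the indices n at which the trial came out 1 (e.g. heads)."""
    return [n for n, x in enumerate(flips) if x == 1]

print(bernoulli_sequence([0, 1, 1, 0, 1]))  # [1, 2, 4]
```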

Almost all Bernoulli sequences are ergodic sequences.

Bernoulli map
Because every trial has one of two possible outcomes, a sequence of trials may be represented by the binary digits of a real number. When the probability p = 1/2, all possible infinite sequences are equally likely, and thus the measure of the σ-algebra of the Bernoulli process is equivalent to the uniform measure on the unit interval: in other words, the corresponding real numbers are distributed uniformly on the unit interval.

The shift operator T taking each random variable to the next,


 * $$TX_i=X_{i+1}$$

is then given by the Bernoulli map, also known as the 2z mod 1 map,


 * $$b(z)=2z-\lfloor 2z \rfloor$$

where $$z\in[0,1]$$ represents a given sequence of measurements, and $$\lfloor z \rfloor$$ is the floor function, the largest integer less than or equal to z. The Bernoulli map essentially lops off the leading digit of the binary expansion of z, shifting the remaining digits one place to the left.
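The digit-shifting behavior is easy to see on a number whose binary expansion terminates. A minimal sketch:

```python
import math

def bernoulli_map(z):
    """The 2z mod 1 map: shifts the binary expansion of z left by one digit."""
    return 2 * z - math.floor(2 * z)

z = 0.625                  # binary 0.101
print(bernoulli_map(z))    # 0.25 = binary 0.01: the leading digit is dropped
```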

The Bernoulli map is an exactly solvable model of deterministic chaos. The transfer operator, or Frobenius-Perron operator, of the Bernoulli map is solvable; the eigenvalues are the powers of 1/2, and the eigenfunctions are the Bernoulli polynomials.
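This eigenvalue relation can be checked numerically. Writing the transfer operator in its standard form $$(\mathcal{L}f)(z) = \tfrac{1}{2}\left[f(z/2) + f((z+1)/2)\right]$$, the second Bernoulli polynomial should be an eigenfunction with eigenvalue $$1/4 = 2^{-2}$$. A small sketch:

```python
def B2(z):
    """Second Bernoulli polynomial: B_2(z) = z^2 - z + 1/6."""
    return z * z - z + 1.0 / 6.0

def transfer(f, z):
    """Transfer (Frobenius-Perron) operator of the Bernoulli map."""
    return 0.5 * (f(z / 2) + f((z + 1) / 2))

# Check L B_2 = (1/4) B_2 at a few sample points.
for z in [0.0, 0.1, 0.5, 0.9]:
    assert abs(transfer(B2, z) - 0.25 * B2(z)) < 1e-12
```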

Generalizations
The generalization of the Bernoulli process to more than two possible outcomes is called the Bernoulli scheme.