Beta function

In mathematics, the beta function, also called the Euler integral of the first kind, is a special function defined by



$$\mathrm{B}(x,y) = \int_0^1 t^{x-1}(1-t)^{y-1}\,dt$$

for Re(x), Re(y) > 0.
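As a quick sanity check, the defining integral can be evaluated numerically and compared with a value obtainable by elementary calculus: expanding the integrand of B(2, 3) gives ∫₀¹ (t − 2t² + t³) dt = 1/12. A minimal sketch using only the Python standard library (the helper name `beta_quad` and the quadrature resolution are our choices):

```python
def beta_quad(x, y, n=100_000):
    """Approximate B(x, y) = integral of t^(x-1) (1-t)^(y-1) over (0, 1),
    using the composite midpoint rule with n subintervals."""
    h = 1.0 / n
    return h * sum(((i + 0.5) * h) ** (x - 1) * (1 - (i + 0.5) * h) ** (y - 1)
                   for i in range(n))

# B(2, 3) = 1/2 - 2/3 + 1/4 = 1/12
print(beta_quad(2, 3))  # ≈ 0.08333…
```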

The beta function was studied by Euler and Legendre and was given its name by Jacques Binet.

Properties
The beta function is symmetric, meaning that



$$\mathrm{B}(x,y) = \mathrm{B}(y,x).$$

It has many other forms, including:



$$\mathrm{B}(x,y) = \dfrac{\Gamma(x)\,\Gamma(y)}{\Gamma(x+y)}$$



$$\mathrm{B}(x,y) = 2\int_0^{\pi/2} \sin^{2x-1}\theta \cos^{2y-1}\theta \,d\theta, \qquad \Re(x)>0,\ \Re(y)>0$$



$$\mathrm{B}(x,y) = \int_0^\infty \dfrac{t^{x-1}}{(1+t)^{x+y}}\,dt, \qquad \Re(x)>0,\ \Re(y)>0$$



$$\mathrm{B}(x,y) = \dfrac{1}{y}\sum_{n=0}^\infty (-1)^n \dfrac{(y)_{n+1}}{n!\,(x+n)}$$

where $$\Gamma(x)$$ is the gamma function and $$(x)_n$$ is the falling factorial, i.e., $$x(x-1)(x-2)\cdots(x-n+1)$$. The second of these forms, evaluated at $$x = y = 1/2$$, gives $$\mathrm{B}(1/2,1/2) = \pi$$, which combined with the first shows in particular that $$\Gamma(1/2) = \sqrt\pi$$.
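These representations can be cross-checked numerically. The sketch below, using only the Python standard library (variable names, step counts, and truncation points are our choices), compares the gamma-function form, the trigonometric integral, and the series in falling factorials:

```python
import math

x, y = 2.5, 1.5

# Form via the gamma function: Γ(x)Γ(y)/Γ(x+y).
g = math.gamma(x) * math.gamma(y) / math.gamma(x + y)

# Trigonometric form: 2 ∫ over (0, π/2) of sin^(2x-1)θ cos^(2y-1)θ dθ (midpoint rule).
n = 100_000
h = (math.pi / 2) / n
trig = 2 * h * sum(math.sin((i + 0.5) * h) ** (2 * x - 1)
                   * math.cos((i + 0.5) * h) ** (2 * y - 1) for i in range(n))

# Series form: (1/y) Σ (-1)^n (y)_{n+1} / (n! (x+n)).
# c tracks the ratio (y)_{n+1} / n! to avoid overflowing factorials.
s, c = 0.0, y
for m in range(5000):
    s += (-1) ** m * c / (x + m)
    c *= (y - (m + 1)) / (m + 1)
series = s / y

print(g, trig, series)  # all three agree to several decimal places
```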

Just as the gamma function for integers describes factorials, the beta function can define a binomial coefficient after adjusting indices:
 * $${n \choose k} = \frac1{(n+1) \mathrm{B}(n-k+1, k+1)}$$
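The identity recovers ordinary binomial coefficients exactly. A small check (the helper name `binom_via_beta` is ours), assuming only the standard library:

```python
import math

def binom_via_beta(n, k):
    # C(n, k) = 1 / ((n+1) B(n-k+1, k+1)), with B written via the gamma function.
    b = math.gamma(n - k + 1) * math.gamma(k + 1) / math.gamma(n + 2)
    return 1.0 / ((n + 1) * b)

print(round(binom_via_beta(10, 3)), math.comb(10, 3))  # both give 120
```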

The beta function was the first known scattering amplitude in string theory, first conjectured by Gabriele Veneziano.

Relationship between Gamma function and Beta function
To derive the relation between the gamma function and the beta function, write the product of two gamma functions as



$$\Gamma(x)\,\Gamma(y) = \int_0^\infty e^{-u} u^{x-1}\,du \int_0^\infty e^{-v} v^{y-1}\,dv.$$

Now, let $$u \equiv a^2$$, $$v \equiv b^2$$, so


 * $$\begin{align} \Gamma(x)\,\Gamma(y) &= 4\int_0^\infty e^{-a^2} a^{2x-1}\,da \int_0^\infty e^{-b^2} b^{2y-1}\,db \\ &= \int_{-\infty}^\infty \int_{-\infty}^\infty e^{-(a^2+b^2)} |a|^{2x-1} |b|^{2y-1} \,da\,db. \end{align}$$

Transforming to polar coordinates with $$a = r\cos\theta$$, $$b = r\sin\theta$$:


 * $$\begin{align} \Gamma(x)\,\Gamma(y) &= \int_0^{2\pi} \int_0^\infty e^{-r^2} |r\cos\theta|^{2x-1} |r\sin\theta|^{2y-1} r \,dr\,d\theta \\ &= \int_0^\infty e^{-r^2} r^{2x+2y-1} \,dr \int_0^{2\pi} |\cos^{2x-1}\theta \sin^{2y-1}\theta| \,d\theta \\ &= \frac{1}{2}\int_0^\infty e^{-r^2} r^{2(x+y-1)} \,d(r^2) \cdot 4\int_0^{\pi/2} \cos^{2x-1}\theta \sin^{2y-1}\theta \,d\theta \\ &= \Gamma(x+y) \cdot 2\int_0^{\pi/2} \cos^{2x-1}\theta \sin^{2y-1}\theta \,d\theta \\ &= \Gamma(x+y)\,\mathrm{B}(x,y). \end{align}$$

Hence, dividing both sides by $$\Gamma(x+y)$$ gives the usual form of the beta function:



$$\mathrm{B}(x,y) = \frac{\Gamma(x)\,\Gamma(y)}{\Gamma(x+y)}.$$

A somewhat more straightforward derivation:

$$\begin{align} \Gamma(x)\,\Gamma(y) &= \int_0^\infty t^{x-1} e^{-t}\,dt \int_0^\infty s^{y-1} e^{-s}\,ds = \int_0^\infty \!\! \int_0^\infty t^{x-1} s^{y-1} e^{-(t+s)} \,dt\,ds. \end{align}$$

The argument of the exponential suggests the substitution

$$\sigma = s + t, \quad \tau = t \quad \Rightarrow \quad |\mathrm{J}| = 1,$$

which gives:

$$\int_0^\infty d\sigma \int_0^\sigma d\tau\; \tau^{x-1} (\sigma-\tau)^{y-1} e^{-\sigma} = \int_0^\infty d\sigma \int_0^\sigma d\tau\; \tau^{x-1} \sigma^{y-1} \Big(1-\frac{\tau}{\sigma}\Big)^{y-1} e^{-\sigma}.$$

Comparison with $$\mathrm{B}(x,y)$$ suggests the further substitution:

$$r = \frac{\tau}{\sigma}, \quad q = \sigma, \quad \text{with Jacobian} \quad |\mathrm{J}| = q,$$

which leads to an easy identification with the expected result:

$$\begin{align} &\int_0^\infty \! dq \int_0^1 dr\; q\,(rq)^{x-1} q^{y-1} (1-r)^{y-1} e^{-q} = \int_0^\infty \! dq \int_0^1 dr\; r^{x-1} (1-r)^{y-1} q^{x+y-1} e^{-q} \\ &\quad = \int_0^\infty q^{x+y-1} e^{-q}\,dq \int_0^1 r^{x-1} (1-r)^{y-1}\,dr = \Gamma(x+y)\,\mathrm{B}(x,y). \end{align}$$

Derivatives
The derivatives follow:


 * $${\partial \over \partial x} \mathrm{B}(x, y) = \mathrm{B}(x, y) \left( {\Gamma'(x) \over \Gamma(x)} - {\Gamma'(x + y) \over \Gamma(x + y)} \right) = \mathrm{B}(x, y) (\psi(x) - \psi(x + y))$$

where $$\psi(x)$$ is the digamma function. The derivative with respect to y follows from the symmetry $$\mathrm{B}(x,y) = \mathrm{B}(y,x)$$.
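The derivative identity $$\partial_x \mathrm{B}(x,y) = \mathrm{B}(x,y)(\psi(x) - \psi(x+y))$$ can be checked numerically. Since the Python standard library lacks a digamma function, the sketch below approximates $$\psi$$ by a central difference of `math.lgamma` (step sizes and helper names are our choices):

```python
import math

def beta(x, y):
    # B(x, y) via log-gamma, to stay numerically stable.
    return math.exp(math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y))

def digamma(x, h=1e-5):
    # ψ(x) = d/dx ln Γ(x), approximated by a central difference of lgamma.
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

x, y = 3.0, 2.0
analytic = beta(x, y) * (digamma(x) - digamma(x + y))  # B(x,y)(ψ(x) - ψ(x+y))
h = 1e-5
numeric = (beta(x + h, y) - beta(x - h, y)) / (2 * h)  # direct finite difference
print(analytic, numeric)  # both ≈ -7/144 ≈ -0.0486
```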

Integrals
The Nörlund-Rice integral is a contour integral involving the beta function.

Approximation
Stirling's approximation gives the asymptotic formula


 * $$\mathrm{B}(x,\,y) \sim \sqrt{2\pi}\,\frac{x^{x-1/2}\,y^{y-1/2}}{(x+y)^{x+y-1/2}}$$

for large x and y.
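The asymptotic $$\mathrm{B}(x,y) \sim \sqrt{2\pi}\, x^{x-1/2} y^{y-1/2} / (x+y)^{x+y-1/2}$$ can be verified numerically; working in log space avoids the underflow of $$\mathrm{B}(x,y)$$ itself at large arguments. A sketch (helper names ours):

```python
import math

def log_beta(x, y):
    return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)

def log_stirling(x, y):
    # log of sqrt(2π) x^(x-1/2) y^(y-1/2) / (x+y)^(x+y-1/2)
    return (0.5 * math.log(2 * math.pi) + (x - 0.5) * math.log(x)
            + (y - 0.5) * math.log(y) - (x + y - 0.5) * math.log(x + y))

for t in (10, 100, 1000):
    print(t, math.exp(log_stirling(t, t) - log_beta(t, t)))  # ratio tends to 1
```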

Incomplete beta function
The incomplete beta function is a generalization of the beta function in which the fixed upper limit 1 of the defining integral is replaced by a variable upper limit x. The situation is analogous to the incomplete gamma function being a generalization of the gamma function.

The incomplete beta function is defined as


 * $$\mathrm{B}(x;\,a,b) = \int_0^x t^{a-1}\,(1-t)^{b-1}\,dt.$$

For x = 1, the incomplete beta function coincides with the complete beta function.

The regularized incomplete beta function (or regularized beta function for short) is defined in terms of the incomplete beta function and the complete beta function:


 * $$I_x(a,b) = \dfrac{\mathrm{B}(x;\,a,b)}{\mathrm{B}(a,b)}.$$

Working out the integral for integer values of a and b, one finds:


 * $$ I_x(a,b) = \sum_{j=a}^{a+b-1} {(a+b-1)! \over j!(a+b-1-j)!} x^j (1-x)^{a+b-1-j}. $$
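This closed form can be checked against direct quadrature of the incomplete beta integral. A sketch using the standard library (quadrature resolution and helper names are our choices):

```python
import math

def inc_beta(x, a, b, n=100_000):
    # B(x; a, b): midpoint-rule quadrature of t^(a-1) (1-t)^(b-1) over (0, x).
    h = x / n
    return h * sum(((i + 0.5) * h) ** (a - 1) * (1 - (i + 0.5) * h) ** (b - 1)
                   for i in range(n))

def reg_inc_beta(x, a, b):
    # I_x(a, b) = B(x; a, b) / B(a, b).
    complete = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return inc_beta(x, a, b) / complete

def reg_inc_beta_sum(x, a, b):
    # Integer closed form: sum over j = a .. a+b-1 of C(a+b-1, j) x^j (1-x)^(a+b-1-j).
    m = a + b - 1
    return sum(math.comb(m, j) * x ** j * (1 - x) ** (m - j) for j in range(a, m + 1))

print(reg_inc_beta(0.3, 3, 4), reg_inc_beta_sum(0.3, 3, 4))  # agree
```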

Properties

 * $$ I_0(a,b) = 0 \, $$
 * $$ I_1(a,b) = 1 \, $$
 * $$ I_x(a,b) = 1 - I_{1-x}(b,a) \, $$
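For integer parameters, all three properties can be verified with the binomial-sum closed form for $$I_x(a,b)$$; a short sketch (helper name ours):

```python
import math

def reg_inc_beta(x, a, b):
    # Regularized incomplete beta for integer a, b, via the binomial-tail sum.
    m = a + b - 1
    return sum(math.comb(m, j) * x ** j * (1 - x) ** (m - j) for j in range(a, m + 1))

print(reg_inc_beta(0.0, 4, 6))  # 0.0
print(reg_inc_beta(1.0, 4, 6))  # 1.0
print(abs(reg_inc_beta(0.35, 4, 6) - (1 - reg_inc_beta(1 - 0.35, 6, 4))))  # ≈ 0
```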
