# Logit

*The logit function is an important part of logistic regression; for more information, see that article.*

In mathematics, especially as applied in statistics, the **logit** (pronounced with a long "o" and a soft "g", IPA /loʊdʒɪt/) of a number *p* between 0 and 1 is given by the formula

$$\operatorname{logit}(p) = \log\frac{p}{1-p} = \log p - \log(1-p).$$

(The base of the logarithm function used here is of little importance in the present article, as long as it is greater than 1, but the natural logarithm with base e is often used.) The logit function is the inverse of the "sigmoid", or "logistic" function.
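The inverse relationship between the logit and the logistic function can be checked numerically. A minimal sketch in Python (the function names `logit` and `logistic` are chosen here for illustration):

```python
import math

def logit(p):
    """Log-odds of a probability p in (0, 1), using the natural logarithm."""
    return math.log(p / (1 - p))

def logistic(x):
    """Standard logistic (sigmoid) function, the inverse of logit."""
    return 1 / (1 + math.exp(-x))

# Round-tripping a probability through logit and logistic recovers it:
p = 0.8
x = logit(p)  # log(0.8 / 0.2) = log(4)
assert abs(logistic(x) - p) < 1e-12
```

Because the natural logarithm is used, `logit` returns log-odds in nats; any base greater than 1 would work equally well, as noted above.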
If *p* is a probability then *p*/(1 − *p*) is the corresponding odds, and the logit of the probability is the logarithm of the odds; similarly the difference between the logits of two probabilities is the logarithm of the odds ratio (OR), thus providing an additive mechanism for combining odds-ratios:

$$\log(\text{OR}) = \log\frac{p_1/(1-p_1)}{p_2/(1-p_2)} = \operatorname{logit}(p_1) - \operatorname{logit}(p_2).$$
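The additive relationship between logits and the log odds ratio can be verified directly. A small sketch, assuming natural-log logits (the probabilities 0.9 and 0.6 are arbitrary example values):

```python
import math

def logit(p):
    """Log-odds of a probability p in (0, 1)."""
    return math.log(p / (1 - p))

p1, p2 = 0.9, 0.6
# Odds ratio: (9) / (1.5) = 6
odds_ratio = (p1 / (1 - p1)) / (p2 / (1 - p2))

# The difference of the two logits equals the log of the odds ratio:
assert abs((logit(p1) - logit(p2)) - math.log(odds_ratio)) < 1e-12
```

This is why effects that multiply odds (such as coefficients in logistic regression) add on the logit scale.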

## History

The logit model was introduced by Joseph Berkson in 1944, who coined the term. The term was borrowed by analogy from the very similar probit model developed by Chester Ittner Bliss in 1934. G. A. Barnard in 1949 coined the commonly used term *log-odds*; the log-odds of an event is the logit of the probability of the event.

## Uses and properties

- The **logit** in logistic regression is a special case of a link function in a generalized linear model.
- The **logit** function is the negative of the derivative of the binary entropy function.
- The **logit** is also central to the probabilistic Rasch model for measurement, which has applications in psychological and educational assessment, among other areas.
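The second property above, that the logit is the negative of the derivative of the binary entropy function H(*p*) = −*p* log *p* − (1 − *p*) log(1 − *p*), can be checked with a finite difference. A sketch in Python (the step size `h` and test point `p` are illustrative choices):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log(p) - (1-p)*log(1-p), in nats."""
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def logit(p):
    """Log-odds of a probability p in (0, 1)."""
    return math.log(p / (1 - p))

# A central finite difference of H at p approximates dH/dp = -logit(p):
p, h = 0.3, 1e-6
dH = (binary_entropy(p + h) - binary_entropy(p - h)) / (2 * h)
assert abs(dH + logit(p)) < 1e-6
```

Analytically, dH/d*p* = log((1 − *p*)/*p*) = −logit(*p*), which the numerical derivative confirms.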

## See also

- Daniel McFadden, winner of the Nobel Memorial Prize in Economic Sciences for his development of a particular logit model used in economics
- Logit analysis in marketing
- Perceptron
- Probit