Weber–Fechner law

Overview
The Weber–Fechner law attempts to describe the relationship between the physical magnitudes of stimuli and the perceived intensity of the stimuli. Ernst Heinrich Weber (1795–1878) was one of the first people to approach the study of the human response to a physical stimulus in a quantitative fashion. Gustav Theodor Fechner (1801–1887) later offered an elaborate theoretical interpretation of Weber's findings, which he called simply Weber's law, though his admirers made the law's name a hyphenate.

Background
Stevens' power law is sometimes considered more accurate and general, although both make assumptions about the measurement of perceived intensity. The Weber–Fechner law assumes that just noticeable differences are additive. L. L. Thurstone used this assumption for the concept of discriminal dispersion in the Law of comparative judgment.

Fechner believed that Weber had discovered the fundamental principle of mind-body interaction, a mathematical analog of the function René Descartes once assigned to the pineal gland.

The case of weight
In one of his classic experiments, Weber gradually increased the weight that a blindfolded man was holding and asked him to respond when he first felt the increase. Weber found that the smallest noticeable difference in weight (the least difference that the test person can still perceive as a difference) was proportional to the starting value of the weight. That is to say, if the weight is 1 kg, an increase of a few grams will not be noticed; rather, the increase is perceived only once the mass has grown by a certain factor. If the mass is doubled, the threshold, called the smallest noticeable difference, also doubles. This kind of relationship can be described by the differential equation


 * $$ dp = k\frac{dS}{S}, \,\!$$

where dp is the differential change in perception, dS is the differential increase in the stimulus, and S is the stimulus at that instant. The constant factor k is determined experimentally.
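The proportionality between the just-noticeable difference and the starting weight can be sketched numerically. The Weber fraction k = 0.02 below is a hypothetical value chosen only for illustration, not a measured one:

```python
def jnd(weight, k=0.02):
    """Smallest noticeable increase in weight for a given starting weight.

    Weber's observation: the just-noticeable difference dS is proportional
    to the current stimulus S, i.e. dS = k * S.  The Weber fraction k here
    is hypothetical, for illustration only.
    """
    return k * weight

# Doubling the starting weight doubles the just-noticeable difference.
print(jnd(1.0))  # JND at 1 kg
print(jnd(2.0))  # JND at 2 kg: twice as large
```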

Integrating the above equation gives


 * $$ p = k \ln{S} + C, \,\!$$

where $$C$$ is the constant of integration and ln is the natural logarithm.

To determine $$C$$, set $$p = 0$$ (i.e. no perception); then


 * $$ C = -k\ln{S_0}, \,\!$$

where $$S_0$$ is that threshold of stimulus below which it is not perceived at all.

Therefore, our equation becomes


 * $$ p = k \ln{\frac{S}{S_0}}. \,\!$$

The relationship between stimulus and perception is logarithmic. This means that if a stimulus varies as a geometric progression (i.e. is multiplied by a fixed factor), the corresponding perception is altered in an arithmetic progression (i.e. by additive constant amounts). For example, if a stimulus is tripled in strength (i.e., 3 × 1), the corresponding perception may be two times as strong as its original value (i.e., 1 + 1). If the stimulus is again tripled in strength (i.e., 3 × 3 × 1), the corresponding perception will be three times as strong as its original value (i.e., 1 + 1 + 1). Hence, multiplications in stimulus strength correspond to additions in perceived strength.
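The geometric-to-arithmetic correspondence can be checked directly from the derived relation p = k ln(S/S₀). The values k = 1.0 and S₀ = 1.0 below are hypothetical, chosen only for illustration:

```python
import math

def perception(S, k=1.0, S0=1.0):
    """Perceived intensity for a stimulus S above the threshold S0,
    per the Weber-Fechner relation p = k * ln(S / S0)."""
    return k * math.log(S / S0)

# A geometric progression of stimuli (each step triples the stimulus) ...
stimuli = [3.0, 9.0, 27.0]
percepts = [perception(S) for S in stimuli]

# ... yields an arithmetic progression of perceptions: each successive
# difference is the same constant, k * ln(3).
steps = [b - a for a, b in zip(percepts, percepts[1:])]
print(steps)
```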

This logarithmic relationship is valid not just for the sensation of weight, but for other stimuli and our sensory perceptions as well.

In addition, the mathematical derivation of the torques on a simple beam balance produces a description that is strictly compatible with Weber's law.

The case of vision
The eye senses brightness logarithmically. Hence stellar magnitude is measured on a logarithmic scale. This magnitude scale was invented by the ancient Greek astronomer Hipparchus in about 150 B.C. He ranked the stars he could see in terms of their brightness, with 1 representing the brightest down to 6 representing the faintest, though the scale has since been extended beyond these limits. An increase of 5 magnitudes corresponds to a decrease in brightness by a factor of 100.
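The modern magnitude scale fixes this relation so that 5 magnitudes correspond exactly to a brightness factor of 100, which gives the logarithmic formula m₂ − m₁ = −2.5 log₁₀(F₂/F₁). A minimal sketch:

```python
import math

def magnitude_difference(flux_ratio):
    """Magnitude difference corresponding to a brightness (flux) ratio F2/F1.

    Brighter objects have smaller magnitudes, hence the minus sign.
    """
    return -2.5 * math.log10(flux_ratio)

# A star 100 times brighter is 5 magnitudes lower (brighter) on the scale.
print(magnitude_difference(100.0))
```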

The case of sound
Still another logarithmic scale is the decibel scale of sound intensity. And yet another is pitch, which, however, differs from the other cases in that the physical quantity involved is not a "strength".
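The decibel scale can be sketched in the same way; the reference intensity I₀ = 10⁻¹² W/m² used below is the conventional threshold of hearing for sound in air:

```python
import math

def intensity_db(I, I0=1e-12):
    """Sound intensity level in decibels relative to the reference I0
    (1e-12 W/m^2, the conventional threshold of hearing in air)."""
    return 10.0 * math.log10(I / I0)

# Each tenfold increase in physical intensity adds 10 dB.
print(intensity_db(1e-11))  # about 10 dB
print(intensity_db(1e-10))  # about 20 dB
```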

In the case of perception of pitch, humans hear pitch in a logarithmic or geometric, ratio-based fashion: for notes to sound equally spaced to the human ear, their frequencies must be related by a multiplicative factor. For instance, the frequencies of corresponding notes in adjacent octaves differ by a factor of 2. Similarly, the perceived difference in pitch between 100 Hz and 150 Hz is the same as between 1000 Hz and 1500 Hz. Musical scales are always based on geometric relationships for this reason. Notation and theory about music often refer to pitch intervals in an additive way, which makes sense if one considers the logarithms of the frequencies, as $$\log(a\times b)=\log a+\log b$$.
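This ratio-based hearing is the basis of equal temperament, where adjacent semitones are related by the fixed factor 2^(1/12), so that twelve semitones make exactly one octave (a frequency factor of 2). A sketch, using the common A4 = 440 Hz tuning reference:

```python
A4 = 440.0  # common tuning reference, in Hz

def note_frequency(semitones_from_a4):
    """Equal-tempered frequency a given number of semitones from A4.

    Each semitone multiplies the frequency by 2**(1/12), so equal steps
    in perceived pitch correspond to equal frequency ratios.
    """
    return A4 * 2.0 ** (semitones_from_a4 / 12.0)

print(note_frequency(12))   # one octave up: 880 Hz
print(note_frequency(-12))  # one octave down: 220 Hz
```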

Loudness: Weber's law does not quite hold for loudness. It is a good approximation for higher amplitudes, but not for lower amplitudes. This is usually referred to as the "near miss" to Weber's law.