In statistical mechanics, the Gibbs algorithm, introduced by Josiah Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system. It selects the distribution that minimizes the average log probability

\langle\ln p_i\rangle = \sum_i p_i \ln p_i

subject to the probability distribution satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities.

In 1948, Claude Shannon interpreted the negative of this quantity, which he called information entropy, as a measure of the uncertainty in a probability distribution. In 1957, E. T. Jaynes realized that this quantity could be interpreted as missing information about anything, and generalized the Gibbs algorithm to non-equilibrium systems with the principle of maximum entropy and maximum entropy thermodynamics.

Physicists call the result of applying the Gibbs algorithm the Gibbs distribution for the given constraints, most notably Gibbs's grand canonical ensemble for open systems, where the average energy and the average number of particles are given. (See also ''partition function''.) This general result of the Gibbs algorithm is then a maximum entropy probability distribution. Statisticians identify such distributions as belonging to exponential families.
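The constrained minimization can be carried out explicitly for a small discrete system. The sketch below is illustrative, not from the source: the four energy levels and the target mean energy are made-up values. It solves for the Lagrange multiplier β by bisection (the mean energy of the Boltzmann-form distribution decreases monotonically in β), recovering the Gibbs distribution p_i ∝ exp(−βE_i), and then checks that any other distribution satisfying the same constraints has lower entropy, i.e. a larger value of Gibbs's quantity ⟨ln p⟩.

```python
import numpy as np

# Toy system: four energy levels and a known macroscopic constraint <E>.
# (Both are made-up illustrative values, not from the source.)
energies = np.array([0.0, 1.0, 2.0, 3.0])
target_mean_energy = 1.2

def gibbs_distribution(beta):
    """Boltzmann weights exp(-beta * E_i), normalised to sum to 1."""
    w = np.exp(-beta * energies)
    return w / w.sum()

def mean_energy(beta):
    return float(gibbs_distribution(beta) @ energies)

def entropy(p):
    """Shannon entropy -sum_i p_i ln p_i (the negative of Gibbs's quantity)."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Solve for beta by bisection: mean_energy(beta) is strictly decreasing,
# so if the mean is still too high, beta must increase.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > target_mean_energy:
        lo = mid
    else:
        hi = mid
beta = 0.5 * (lo + hi)
p_gibbs = gibbs_distribution(beta)

# A competing distribution obeying the same constraints: perturb p_gibbs
# along a direction d with sum(d) = 0 and d @ energies = 0, so both the
# normalisation and the mean-energy constraint are preserved exactly.
d = np.array([1.0, -1.0, -1.0, 1.0])
p_other = p_gibbs + 0.02 * d

print(entropy(p_gibbs), entropy(p_other))  # the Gibbs distribution has the larger entropy
```

Because the entropy is strictly concave, the Gibbs distribution is the unique maximizer over all distributions meeting the constraints, which is why every constraint-preserving perturbation can only lower it.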


