In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form

:p_i \propto e^{-\varepsilon_i/(kT)}

where ''p_i'' is the probability of the system being in state ''i'', ''ε_i'' is the energy of that state, and the constant ''kT'' of the distribution is the product of the Boltzmann constant ''k'' and the thermodynamic temperature ''T''. The symbol \propto denotes proportionality (see below for the proportionality constant).

The term ''system'' here has a very wide meaning; it can range from a single atom, or a collection of a sufficient number of atoms, to a macroscopic system such as a natural gas storage tank. The Boltzmann distribution can therefore be used to solve a very wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.

The ''ratio'' of probabilities of two states is known as the Boltzmann factor and characteristically depends only on the states' energy difference:

:\frac{p_i}{p_j} = e^{(\varepsilon_j - \varepsilon_i)/(kT)}

The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium. Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium". The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902.

The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution or Maxwell–Boltzmann statistics. The Boltzmann distribution gives the probability that a system will be in a certain ''state'' as a function of that state's energy (Atkins, P. W. (2010) ''Quanta'', W. H. Freeman and Company, New York), while the Maxwell–Boltzmann distributions give the probabilities of particle ''speeds'' or ''energies'' in ideal gases. The distribution of energies in a one-dimensional gas, however, does follow the Boltzmann distribution.
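As a rough numerical illustration (the specific numbers are not from the source): for two non-degenerate states whose energies differ by \varepsilon_j - \varepsilon_i = 0.1 eV, at room temperature (T ≈ 300 K, so kT ≈ 0.0259 eV) the Boltzmann factor is

:\frac{p_i}{p_j} = e^{(\varepsilon_j - \varepsilon_i)/(kT)} \approx e^{3.87} \approx 48

so the lower-energy state is roughly fifty times more likely to be occupied.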


The distribution

The Boltzmann distribution is a probability distribution that gives the probability of a certain state as a function of that state's energy and the temperature of the system to which the distribution is applied. It is given as

:p_i = \frac{1}{Q} e^{-\varepsilon_i/(kT)} = \frac{e^{-\varepsilon_i/(kT)}}{\sum_{j=1}^{M} e^{-\varepsilon_j/(kT)}}

where ''p_i'' is the probability of state ''i'', ''ε_i'' the energy of state ''i'', ''k'' the Boltzmann constant, ''T'' the absolute temperature of the system, and ''M'' the number of states accessible to the system of interest. The normalizing denominator ''Q'' (denoted by some authors as ''Z'') is the canonical partition function

:Q = \sum_{i=1}^{M} e^{-\varepsilon_i/(kT)}

It results from the constraint that the probabilities of all accessible states must add up to 1.

The Boltzmann distribution is the distribution that maximizes the entropy

:H(p_1, p_2, \ldots, p_M) = -\sum_{i=1}^{M} p_i \log_2 p_i

subject to the normalization constraint and the constraint that \sum_i p_i \varepsilon_i equals a particular mean energy value; this can be proven using Lagrange multipliers. The partition function can be calculated if we know the energies of the states accessible to the system of interest. For atoms, partition function values can be found in the NIST Atomic Spectra Database.

The distribution shows that states with lower energy will always have a higher probability of being occupied than states with higher energy. It also gives the quantitative relationship between the probabilities of two states being occupied. The ratio of probabilities for states ''i'' and ''j'' is

:\frac{p_i}{p_j} = e^{(\varepsilon_j - \varepsilon_i)/(kT)}

where ''p_i'' is the probability of state ''i'', ''p_j'' the probability of state ''j'', and ''ε_i'' and ''ε_j'' are the energies of states ''i'' and ''j'', respectively. The corresponding ratio of populations of energy levels must also take their degeneracies into account.

The Boltzmann distribution is often used to describe the distribution of particles, such as atoms or molecules, over the bound states accessible to them. For a system of many particles, the probability of a particle being in state ''i'' is practically the probability that, if we pick a random particle from the system and check which state it is in, we will find it in state ''i''. This probability equals the number of particles in state ''i'' divided by the total number of particles in the system, that is, the fraction of particles that occupy state ''i'':

:p_i = \frac{N_i}{N}

where ''N_i'' is the number of particles in state ''i'' and ''N'' is the total number of particles in the system. We may use the Boltzmann distribution to find this probability, which, as we have seen, equals the fraction of particles in state ''i''. The equation giving the fraction of particles in state ''i'' as a function of the energy of that state is

:\frac{N_i}{N} = \frac{e^{-\varepsilon_i/(kT)}}{\sum_{j=1}^{M} e^{-\varepsilon_j/(kT)}}

This equation is of great importance to spectroscopy.
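These population formulas are straightforward to evaluate numerically, for example when estimating level populations before interpreting a spectrum. A minimal sketch in Python with NumPy (the energy values are illustrative, not taken from the article):

import numpy as np

def boltzmann_probabilities(energies_eV, T):
    """Boltzmann probabilities p_i = exp(-eps_i/(kT)) / Q for state energies (eV) at temperature T (K)."""
    k_eV = 8.617333e-5                                 # Boltzmann constant in eV/K
    eps = np.asarray(energies_eV, dtype=float)
    # Shift by the minimum energy before exponentiating for numerical stability;
    # the shift cancels in the normalization.
    weights = np.exp(-(eps - eps.min()) / (k_eV * T))  # unnormalized Boltzmann factors
    Q = weights.sum()                                  # partition function of the shifted energies
    return weights / Q

# Illustrative three-level system at room temperature.
energies = [0.0, 0.05, 0.10]                           # made-up state energies in eV
p = boltzmann_probabilities(energies, T=300.0)
print(p)                                               # lower-energy states get higher probability
print(p[0] / p[1])                                     # matches the Boltzmann factor exp((eps_1 - eps_0)/(kT))

The same probabilities give the fractions ''N_i/N'' directly, since the fraction of particles in state ''i'' equals ''p_i''.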
In spectroscopy we observe a spectral line of atoms or molecules undergoing transitions from one state to another. For this to be possible, there must be some particles in the first state to undergo the transition. We can check whether this condition is fulfilled by finding the fraction of particles in the first state. If it is negligible, the transition is very likely not observed at the temperature for which the calculation was done. In general, a larger fraction of molecules in the first state means a higher number of transitions to the second state, and hence a stronger spectral line. However, other factors also influence the intensity of a spectral line, such as whether it is caused by an allowed or a forbidden transition.

The softmax function commonly used in machine learning is related to the Boltzmann distribution:

:(p_1, \ldots, p_M) = \operatorname{softmax}(-\varepsilon_1/(kT), \ldots, -\varepsilon_M/(kT))
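As a quick check of this relationship (a sketch with made-up energies, assuming NumPy), the softmax of the scaled negative energies reproduces the Boltzmann probabilities:

import numpy as np

k_eV, T = 8.617333e-5, 300.0                  # Boltzmann constant (eV/K) and temperature (K)
energies = np.array([0.0, 0.05, 0.10])        # illustrative state energies in eV

def softmax(x):
    z = np.exp(x - np.max(x))                 # subtract the maximum for numerical stability
    return z / z.sum()

p_softmax = softmax(-energies / (k_eV * T))
weights = np.exp(-energies / (k_eV * T))
p_boltzmann = weights / weights.sum()
print(np.allclose(p_softmax, p_boltzmann))    # True: the two expressions agree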


Generalized Boltzmann distribution

A distribution of the form

:\Pr(\omega) \propto \exp\left[\sum_{\eta=1}^{n} \frac{X_\eta x_\eta^{(\omega)}}{k_B T} - \frac{E^{(\omega)}}{k_B T}\right]

is called the generalized Boltzmann distribution by some authors. The Boltzmann distribution is a special case of the generalized Boltzmann distribution. The generalized Boltzmann distribution is used in statistical mechanics to describe the canonical ensemble, the grand canonical ensemble, and the isothermal–isobaric ensemble. It is usually derived from the principle of maximum entropy, but there are other derivations.

The generalized Boltzmann distribution has the following properties:
* It is the only distribution for which the entropy as defined by the Gibbs entropy formula matches the entropy as defined in classical thermodynamics.
* It is the only distribution that is mathematically consistent with the fundamental thermodynamic relation when state functions are described by ensemble averages.
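As an illustration (these identifications are standard but not spelled out in the source): choosing a single pair with X_1 = \mu (the chemical potential) and x_1^{(\omega)} = N^{(\omega)} (the particle number) gives the grand canonical weight

:\Pr(\omega) \propto \exp\left[\frac{\mu N^{(\omega)} - E^{(\omega)}}{k_B T}\right]

while X_1 = -P and x_1^{(\omega)} = V^{(\omega)} gives the isothermal–isobaric weight

:\Pr(\omega) \propto \exp\left[-\frac{E^{(\omega)} + P V^{(\omega)}}{k_B T}\right]

and taking no extra pairs at all (n = 0) recovers the ordinary Boltzmann distribution of the canonical ensemble.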


In statistical mechanics

The Boltzmann distribution appears in statistical mechanics when considering closed systems of fixed composition that are in thermal equilibrium (equilibrium with respect to energy exchange). The most general case is the probability distribution for the canonical ensemble. Some special cases (derivable from the canonical ensemble) show the Boltzmann distribution in different aspects:

; Canonical ensemble (general case) : The canonical ensemble gives the probabilities of the various possible states of a closed system of fixed volume, in thermal equilibrium with a heat bath. The canonical ensemble has a state probability distribution with the Boltzmann form.

; Statistical frequencies of subsystems' states (in a non-interacting collection) : When the system of interest is a collection of many non-interacting copies of a smaller subsystem, it is sometimes useful to find the statistical frequency of a given subsystem state among the collection. The canonical ensemble has the property of separability when applied to such a collection: as long as the non-interacting subsystems have fixed composition, each subsystem's state is independent of the others and is also characterized by a canonical ensemble. As a result, the expected statistical frequency distribution of subsystem states has the Boltzmann form.

; Maxwell–Boltzmann statistics of classical gases (systems of non-interacting particles) : In particle systems, many particles share the same space and regularly change places with each other; the single-particle state space they occupy is a shared space. Maxwell–Boltzmann statistics give the expected number of particles found in a given single-particle state in a classical gas of non-interacting particles at equilibrium. This expected number distribution has the Boltzmann form.

Although these cases have strong similarities, it is helpful to distinguish them, as they generalize in different ways when the crucial assumptions are changed:
* When a system is in thermodynamic equilibrium with respect to both energy exchange ''and particle exchange'', the requirement of fixed composition is relaxed and a grand canonical ensemble is obtained rather than a canonical ensemble. On the other hand, if both composition and energy are fixed, then a microcanonical ensemble applies instead.
* If the subsystems within a collection ''do'' interact with each other, then the expected frequencies of subsystem states no longer follow a Boltzmann distribution, and may not even have an analytical solution. The canonical ensemble can, however, still be applied to the ''collective'' states of the entire system considered as a whole, provided the entire system is in thermal equilibrium.
* With ''quantum'' gases of non-interacting particles in equilibrium, the number of particles found in a given single-particle state does not follow Maxwell–Boltzmann statistics, and there is no simple closed-form expression for quantum gases in the canonical ensemble. In the grand canonical ensemble, the state-filling statistics of quantum gases are described by Fermi–Dirac statistics or Bose–Einstein statistics, depending on whether the particles are fermions or bosons, respectively (a numerical comparison is sketched after this list).
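To make the last point concrete, a small sketch (illustrative numbers, assuming NumPy) compares the mean occupation of a single-particle state of energy ε at chemical potential μ under the three statistics:

import numpy as np

def occupations(eps, mu, kT):
    """Mean occupation of one single-particle state under
    Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein statistics."""
    x = (eps - mu) / kT
    mb = np.exp(-x)                 # classical (Maxwell-Boltzmann) occupation
    fd = 1.0 / (np.exp(x) + 1.0)    # fermions (Fermi-Dirac)
    be = 1.0 / (np.exp(x) - 1.0)    # bosons (Bose-Einstein), requires eps > mu
    return mb, fd, be

# A state 0.1 eV above the chemical potential at kT = 0.025 eV.
print(occupations(eps=0.1, mu=0.0, kT=0.025))
# All three agree when (eps - mu) >> kT, i.e. in the dilute classical limit
# where Maxwell-Boltzmann statistics become a good approximation.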


In mathematics

In more general mathematical settings, the Boltzmann distribution is also known as the Gibbs measure. In statistics and machine learning, it is called a log-linear model. In deep learning, the Boltzmann distribution is used in the sampling distribution of stochastic neural networks such as the Boltzmann machine, the restricted Boltzmann machine, energy-based models, and the deep Boltzmann machine. The Boltzmann machine is considered an unsupervised learning model. In the design of Boltzmann machines, as the number of nodes increases, the difficulty of implementation in real-time applications becomes critical, so a different architecture, the restricted Boltzmann machine, was introduced.
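A minimal sketch of how the Boltzmann distribution enters such an energy-based model (illustrative only: a tiny fully visible Boltzmann machine whose 2^n states are enumerated exactly, which is feasible only for a handful of units):

import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                                          # number of binary units (kept tiny so all states can be enumerated)
W = rng.normal(scale=0.5, size=(n, n))
W = np.triu(W, 1) + np.triu(W, 1).T            # symmetric weights with zero diagonal
b = rng.normal(scale=0.5, size=n)              # unit biases

def energy(s):
    return -0.5 * s @ W @ s - b @ s            # Boltzmann machine energy of configuration s

states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
E = np.array([energy(s) for s in states])
weights = np.exp(-E)                            # Boltzmann weights (temperature absorbed into W and b)
p = weights / weights.sum()                     # exact Boltzmann distribution over all 2^n configurations
print(states[np.argmax(p)], p.max())            # most probable configuration and its probability

In practice the state space is far too large to enumerate, which is why training relies on sampling methods such as Gibbs sampling and why the restricted architecture mentioned above is preferred.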


In economics

The Boltzmann distribution can be introduced to allocate permits in emissions trading (Park, J.-W., Kim, C. U. and Isard, W. (2012) Permit allocation in emissions trading using the Boltzmann distribution. ''Physica A'' 391: 4883–4890). The new allocation method using the Boltzmann distribution can describe the most probable, natural, and unbiased distribution of emissions permits among multiple countries.

The Boltzmann distribution has the same form as the multinomial logit model. As a discrete choice model, this is very well known in economics since Daniel McFadden made the connection to random utility maximization.
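For instance (made-up utilities, not from the cited work), the multinomial logit choice probabilities have exactly the Boltzmann form, with utility playing the role of negative energy and the scale parameter playing the role of kT:

import numpy as np

def logit_choice_probabilities(utilities, scale=1.0):
    """Multinomial logit: P_i = exp(V_i/scale) / sum_j exp(V_j/scale),
    formally a Boltzmann distribution with V_i = -eps_i and scale = kT."""
    v = np.asarray(utilities, dtype=float) / scale
    z = np.exp(v - np.max(v))                  # stabilized exponentials
    return z / z.sum()

print(logit_choice_probabilities([1.0, 0.5, -0.2]))   # probabilities for three alternatives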


See also

* Bose–Einstein statistics
* Fermi–Dirac statistics
* Negative temperature
* Softmax function

