Maxwell–Boltzmann statistics

In statistical mechanics, Maxwell–Boltzmann statistics describes the distribution of classical material particles over various energy states in thermal equilibrium. It is applicable when the temperature is high enough or the particle density is low enough to render quantum effects negligible. The expected number of particles with energy \varepsilon_i for Maxwell–Boltzmann statistics is

:\langle N_i \rangle = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT}} = \frac{N}{Z}\,g_i e^{-\varepsilon_i/kT},

where:
*\varepsilon_i is the energy of the ''i''-th energy level,
*\langle N_i \rangle is the average number of particles in the set of states with energy \varepsilon_i,
*g_i is the degeneracy of energy level ''i'', that is, the number of states with energy \varepsilon_i which may nevertheless be distinguished from each other by some other means (for example, two simple point particles may have the same energy but different momentum vectors; the degeneracy is then the number of ways they can be so distinguished),
*\mu is the chemical potential,
*''k'' is the Boltzmann constant,
*''T'' is the absolute temperature,
*''N'' is the total number of particles: N = \sum_i N_i,
*''Z'' is the partition function: Z = \sum_i g_i e^{-\varepsilon_i/kT},
*''e'' is Euler's number.

Equivalently, the number of particles is sometimes expressed as

:\langle N_i \rangle = \frac{1}{e^{(\varepsilon_i - \mu)/kT}} = \frac{N}{Z}\,e^{-\varepsilon_i/kT},

where the index ''i'' now specifies a particular state rather than the set of all states with energy \varepsilon_i, and Z = \sum_i e^{-\varepsilon_i/kT}.
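As a quick numerical illustration, the populations \langle N_i \rangle = (N/Z)\,g_i e^{-\varepsilon_i/kT} can be computed directly. This is a sketch with made-up energy levels and degeneracies, not data from the article:

```python
from math import exp

k = 1.380649e-23                    # Boltzmann constant, J/K
T = 300.0                           # temperature, K
# Hypothetical three-level system: energies (J) and degeneracies.
energies = [0.0, 1.0e-21, 2.0e-21]
degeneracies = [1, 2, 1]
N = 1000.0                          # total number of particles

# Partition function Z = sum_i g_i * exp(-eps_i / kT).
Z = sum(g * exp(-e / (k * T)) for g, e in zip(degeneracies, energies))

# Expected populations <N_i> = (N / Z) * g_i * exp(-eps_i / kT).
populations = [N * g * exp(-e / (k * T)) / Z
               for g, e in zip(degeneracies, energies)]

print([round(p, 1) for p in populations])
```

By construction the populations sum to N, and levels of higher energy (at equal degeneracy) are less populated.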


History

Maxwell–Boltzmann statistics grew out of the Maxwell–Boltzmann distribution, most likely as a distillation of the underlying technique. The distribution was first derived by Maxwell in 1860 on heuristic grounds. Boltzmann later, in the 1870s, carried out significant investigations into the physical origins of this distribution. The distribution can be derived on the ground that it maximizes the entropy of the system.


Applicability

Maxwell–Boltzmann statistics is used to derive the Maxwell–Boltzmann distribution of an ideal gas. However, it can also be used to extend that distribution to particles with a different energy–momentum relation, such as relativistic particles (resulting in the Maxwell–Jüttner distribution), and to spaces of dimension other than three.

Maxwell–Boltzmann statistics is often described as the statistics of "distinguishable" classical particles. In other words, the configuration of particle ''A'' in state 1 and particle ''B'' in state 2 is different from the case in which particle ''B'' is in state 1 and particle ''A'' is in state 2. This assumption leads to the proper (Boltzmann) statistics of particles in the energy states, but yields non-physical results for the entropy, as embodied in the Gibbs paradox. At the same time, there are no real particles that have the characteristics required by Maxwell–Boltzmann statistics. Indeed, the Gibbs paradox is resolved if we treat all particles of a certain type (e.g., electrons, protons, etc.) as fundamentally indistinguishable. Once this assumption is made, the particle statistics change. The change in entropy in the entropy-of-mixing example may be viewed as an example of a non-extensive entropy resulting from the distinguishability of the two types of particles being mixed.

Quantum particles are either bosons (following instead Bose–Einstein statistics) or fermions (subject to the Pauli exclusion principle, following instead Fermi–Dirac statistics). Both of these quantum statistics approach Maxwell–Boltzmann statistics in the limit of high temperature and low particle density.
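The convergence of the two quantum statistics to Maxwell–Boltzmann statistics in the dilute limit can be checked numerically. In this sketch (not code from any source), x stands for (\varepsilon - \mu)/kT; the dilute, high-temperature regime corresponds to large x (mean occupancy much less than one):

```python
from math import exp

# Average occupancy per single-particle state as a function of
# x = (eps - mu) / kT for the three statistics.
def maxwell_boltzmann(x):
    return exp(-x)

def bose_einstein(x):
    return 1.0 / (exp(x) - 1.0)

def fermi_dirac(x):
    return 1.0 / (exp(x) + 1.0)

# In the dilute limit (large x) all three agree; the relative
# deviation from the classical value shrinks like exp(-x).
for x in [5.0, 10.0, 15.0]:
    print(x, maxwell_boltzmann(x), bose_einstein(x), fermi_dirac(x))
```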


Derivations

Maxwell–Boltzmann statistics can be derived in various statistical-mechanical thermodynamic ensembles:
* the grand canonical ensemble, exactly;
* the canonical ensemble, but only in the thermodynamic limit;
* the microcanonical ensemble, exactly.

In each case it is necessary to assume that the particles are non-interacting, and that multiple particles can occupy the same state and do so independently.


Derivation from microcanonical ensemble

Suppose we have a container with a huge number of very small particles, all with identical physical characteristics (such as mass, charge, etc.). Let's refer to this as the ''system''. Assume that though the particles have identical properties, they are distinguishable. For example, we might identify each particle by continually observing its trajectory, or by placing a marking on each one, e.g., drawing a different number on each as is done with lottery balls.

The particles are moving inside the container in all directions with great speed. Because the particles are speeding around, they possess some energy. The Maxwell–Boltzmann distribution is a mathematical function that describes how many particles in the container have a certain energy. More precisely, the Maxwell–Boltzmann distribution gives the non-normalized probability (meaning that the probabilities do not add up to 1) that the state corresponding to a particular energy is occupied.

In general, there may be many particles with the same amount of energy \varepsilon. Let the number of particles with energy \varepsilon_1 be N_1, the number of particles possessing another energy \varepsilon_2 be N_2, and so forth for all the possible energies \{\varepsilon_i\}. To describe this situation, we say that N_i is the ''occupation number'' of the ''energy level'' i. If we know all the occupation numbers \{N_i\}, then we know the total energy of the system. However, because we can distinguish between ''which'' particles are occupying each energy level, the set of occupation numbers \{N_i\} does not completely describe the state of the system. To completely describe the state of the system, or the ''microstate'', we must specify exactly which particles are in each energy level. Thus when we count the number of possible states of the system, we must count each and every microstate, and not just the possible sets of occupation numbers.
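A tiny enumeration makes the distinction between occupation numbers and microstates concrete. This sketch uses two hypothetical distinguishable particles and two energy levels:

```python
from itertools import product
from collections import Counter

# Two distinguishable particles A and B, two energy levels 0 and 1.
# A microstate assigns a level to each particle: (level of A, level of B).
microstates = list(product([0, 1], repeat=2))

# Occupation numbers (N_0, N_1) for each microstate.
occupations = Counter((2 - sum(m), sum(m)) for m in microstates)

print(occupations)
# The occupation (1, 1) is realized by two distinct microstates,
# (A in level 0, B in level 1) and (A in level 1, B in level 0),
# so occupation numbers alone do not determine the microstate.
```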
To begin with, assume that there is only one way to put N_i particles into energy level i (there is no degeneracy). What follows next is a bit of combinatorial thinking which has little to do with accurately describing the reservoir of particles. For instance, say there is a total of k boxes labelled a, b, \ldots, k. With the concept of combinations, we can calculate how many ways there are to arrange N balls so that the ''l''-th box holds N_l balls, without regard to order within a box. To begin, we select N_a balls from a total of N balls, placing them in box a, and continue selecting from the remainder until no ball is left outside. The total number of ways the balls can be arranged is

:W = \frac{N!}{N_a!(N-N_a)!} \times \frac{(N-N_a)!}{N_b!(N-N_a-N_b)!} \times \cdots \times \frac{(N-N_a-\cdots-N_j)!}{N_k!(N-N_a-\cdots-N_j-N_k)!} = \frac{N!}{N_a!N_b!\cdots N_k!(N-N_a-N_b-\cdots-N_k)!}

and because not a single ball is to be left outside the boxes (all balls must be put in boxes), the sum of the terms N_a, N_b, \ldots, N_k must equal N; thus the term (N - N_a - N_b - \cdots - N_k)! in the relation above evaluates to 0! = 1, and the relation simplifies to

:W = N!\prod_{l=a}^{k} \frac{1}{N_l!}

This is just the multinomial coefficient, the number of ways of arranging ''N'' items into ''k'' boxes, the ''l''-th box holding N_l items, ignoring the permutation of items within each box.

Now consider the case where there is more than one way to put N_i particles in box i (i.e., taking the degeneracy problem into consideration). If the i-th box has a "degeneracy" of g_i, that is, it has g_i "sub-boxes" (g_i boxes with the same energy \varepsilon_i; these states with the same energy are called degenerate states), such that any way of filling the i-th box in which the numbers in the sub-boxes are changed is a distinct way of filling the box, then the number of ways of filling the ''i''-th box must be increased by the number of ways of distributing the N_i objects among the g_i "sub-boxes". The number of ways of placing N_i distinguishable objects in g_i "sub-boxes" is g_i^{N_i} (the first object can go into any of the g_i boxes, the second object can also go into any of the g_i boxes, and so on). Thus the number of ways W that a total of N particles can be classified into energy levels according to their energies, with each level i having g_i distinct states such that the ''i''-th level accommodates N_i particles, is:

:W = N!\prod_i \frac{g_i^{N_i}}{N_i!}

This is the form for ''W'' first derived by Boltzmann. Boltzmann's fundamental equation S = k\,\ln W relates the thermodynamic entropy ''S'' to the number of microstates ''W'', where ''k'' is the Boltzmann constant. It was pointed out by Gibbs, however, that the above expression for ''W'' does not yield an extensive entropy, and is therefore faulty. This problem is known as the Gibbs paradox. The problem is that the particles considered by the above equation are not indistinguishable. In other words, for two particles (''A'' and ''B'') in two energy sublevels the population represented by [A, B] is considered distinct from the population [B, A], while for indistinguishable particles they are not. If we carry out the argument for indistinguishable particles, we are led to the Bose–Einstein expression for ''W'':

:W = \prod_i \frac{(N_i + g_i - 1)!}{N_i!\,(g_i - 1)!}

The Maxwell–Boltzmann distribution follows from this Bose–Einstein distribution for temperatures well above absolute zero, implying that g_i \gg 1. The Maxwell–Boltzmann distribution also requires low density, implying that g_i \gg N_i. Under these conditions, we may use Stirling's approximation for the factorial,

:N! \approx N^N e^{-N},

to write:

:W \approx \prod_i \frac{(N_i + g_i)^{N_i + g_i}}{N_i^{N_i} g_i^{g_i}} \approx \prod_i \frac{g_i^{N_i}(1 + N_i/g_i)^{g_i}}{N_i^{N_i}}

Using the fact that (1 + N_i/g_i)^{g_i} \approx e^{N_i} for g_i \gg N_i, we can again use Stirling's approximation to write:

:W \approx \prod_i \frac{g_i^{N_i}}{N_i!}

This is essentially a division by ''N''! of Boltzmann's original expression for ''W'', and this correction is referred to as ''correct Boltzmann counting''.

We wish to find the N_i for which the function W is maximized, subject to the constraints of a fixed number of particles \left(N = \sum N_i\right) and a fixed energy \left(E = \sum N_i \varepsilon_i\right) in the container. The maxima of W and \ln(W) are achieved by the same values of N_i and, since it is easier to accomplish mathematically, we will maximize the latter function instead. We constrain our solution using Lagrange multipliers, forming the function:

:f(N_1, N_2, \ldots, N_n) = \ln(W) + \alpha\left(N - \sum N_i\right) + \beta\left(E - \sum N_i \varepsilon_i\right)

:\ln W = \ln\left[\prod_{i=1}^{n}\frac{g_i^{N_i}}{N_i!}\right] \approx \sum_{i=1}^{n}\left(N_i\ln g_i - N_i\ln N_i + N_i\right)

Finally

:f(N_1, N_2, \ldots, N_n) = \alpha N + \beta E + \sum_{i=1}^{n}\left(N_i\ln g_i - N_i\ln N_i + N_i - (\alpha + \beta\varepsilon_i)N_i\right)

To maximize the expression above we apply Fermat's theorem on stationary points, according to which local extrema, if they exist, must occur at critical points, where the partial derivatives vanish:

:\frac{\partial f}{\partial N_i} = \ln g_i - \ln N_i - (\alpha + \beta\varepsilon_i) = 0

By solving the equations above (for i = 1, \ldots, n) we arrive at an expression for N_i:

:N_i = \frac{g_i}{e^{\alpha + \beta\varepsilon_i}}

Substituting this expression for N_i into the equation for \ln W and assuming that N \gg 1 yields:

:\ln W = (\alpha + 1)N + \beta E

or, rearranging:

:E = \frac{\ln W}{\beta} - \frac{N}{\beta} - \frac{\alpha N}{\beta}

Boltzmann realized that this is just an expression of the Euler-integrated fundamental equation of thermodynamics. Identifying ''E'' as the internal energy, the Euler-integrated fundamental equation states that:

:E = TS - PV + \mu N

where ''T'' is the temperature, ''P'' is the pressure, ''V'' is the volume, and \mu is the chemical potential. Boltzmann's famous equation S = k\ln W is the realization that the entropy is proportional to \ln W, with the constant of proportionality being the Boltzmann constant. Using the ideal gas equation of state (''PV'' = ''NkT''), it follows immediately that \beta = 1/kT and \alpha = -\mu/kT, so that the populations may now be written:

:N_i = \frac{g_i}{e^{(\varepsilon_i - \mu)/kT}}

Note that the above formula is sometimes written:

:N_i = \frac{g_i}{e^{\varepsilon_i/kT}/z}

where z = \exp(\mu/kT) is the absolute activity. Alternatively, we may use the fact that

:\sum_i N_i = N

to obtain the population numbers as

:N_i = N\,\frac{g_i e^{-\varepsilon_i/kT}}{Z}

where ''Z'' is the partition function defined by:

:Z = \sum_i g_i e^{-\varepsilon_i/kT}

In an approximation where \varepsilon_i is considered to be a continuous variable, the Thomas–Fermi approximation yields a continuous degeneracy g proportional to \sqrt{\varepsilon}, so that:

:\frac{N(\varepsilon)\,d\varepsilon}{N} = \frac{\sqrt{\varepsilon}\,e^{-\varepsilon/kT}\,d\varepsilon}{\int_0^\infty \sqrt{\varepsilon'}\,e^{-\varepsilon'/kT}\,d\varepsilon'} = \frac{2}{\sqrt{\pi}}\,\frac{\sqrt{\varepsilon}\,e^{-\varepsilon/kT}}{(kT)^{3/2}}\,d\varepsilon

which is just the Maxwell–Boltzmann distribution for the energy.
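The counting formula W = N!\prod_i g_i^{N_i}/N_i! at the heart of this derivation can be checked by brute-force enumeration for a small hypothetical system (illustrative numbers, not from the text):

```python
from itertools import product
from math import factorial

# Two energy levels with degeneracies g = (2, 3) and N = 4 labelled
# particles; each particle independently occupies one of the
# g_0 + g_1 = 5 individual states.
g = (2, 3)
N = 4
states = [(level, sub) for level, gl in enumerate(g) for sub in range(gl)]

def count_microstates(occupation):
    # Enumerate all 5**4 assignments of labelled particles to states
    # and count those realizing the given occupation numbers (N_0, N_1).
    count = 0
    for assignment in product(states, repeat=N):
        ns = tuple(sum(1 for (lvl, _) in assignment if lvl == i)
                   for i in range(len(g)))
        if ns == occupation:
            count += 1
    return count

def w_formula(occupation):
    # W = N! * prod_i g_i**N_i / N_i!
    w = factorial(N)
    for Ni, gi in zip(occupation, g):
        w = w * gi**Ni // factorial(Ni)
    return w

for occ in [(0, 4), (1, 3), (2, 2), (3, 1), (4, 0)]:
    assert count_microstates(occ) == w_formula(occ)
```

Summing W over all occupation numbers recovers the total number of assignments, (g_0 + g_1)^N = 5^4.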


Derivation from canonical ensemble

In the above discussion, the Boltzmann distribution function was obtained by directly analysing the multiplicities of a system. Alternatively, one can make use of the canonical ensemble. In a canonical ensemble, a system is in thermal contact with a reservoir. While energy is free to flow between the system and the reservoir, the reservoir is thought to have an infinitely large heat capacity so as to maintain a constant temperature, ''T'', for the combined system.

In the present context, our system is assumed to have the energy levels \varepsilon_i with degeneracies g_i. As before, we would like to calculate the probability that our system has energy \varepsilon_i. If our system is in state s_1, then there would be a corresponding number of microstates available to the reservoir. Call this number \Omega_R(s_1). By assumption, the combined system (of the system we are interested in and the reservoir) is isolated, so all microstates are equally probable. Therefore, for instance, if \Omega_R(s_1) = 2\,\Omega_R(s_2), we can conclude that our system is twice as likely to be in state s_1 as in state s_2. In general, if P(s_i) is the probability that our system is in state s_i,

:\frac{P(s_1)}{P(s_2)} = \frac{\Omega_R(s_1)}{\Omega_R(s_2)}.

Since the entropy of the reservoir is S_R = k\ln\Omega_R, the above becomes

:\frac{P(s_1)}{P(s_2)} = \frac{e^{S_R(s_1)/k}}{e^{S_R(s_2)/k}} = e^{(S_R(s_1) - S_R(s_2))/k}.

Next we recall the thermodynamic identity (from the first law of thermodynamics):

:dS_R = \frac{1}{T}(dU_R + P\,dV_R - \mu\,dN_R).

In a canonical ensemble, there is no exchange of particles, so the dN_R term is zero. Similarly, dV_R = 0. This gives

:S_R(s_1) - S_R(s_2) = \frac{1}{T}(U_R(s_1) - U_R(s_2)) = -\frac{1}{T}(E(s_1) - E(s_2)),

where U_R(s_i) and E(s_i) denote the energies of the reservoir and of the system when the system is in state s_i, respectively. For the second equality we have used the conservation of energy. Substituting into the first equation relating P(s_1) and P(s_2):

:\frac{P(s_1)}{P(s_2)} = \frac{e^{-E(s_1)/kT}}{e^{-E(s_2)/kT}},

which implies, for any state ''s'' of the system,

:P(s) = \frac{1}{Z} e^{-E(s)/kT},

where ''Z'' is an appropriately chosen "constant" to make the total probability 1. (''Z'' is constant provided that the temperature ''T'' is invariant.)

:Z = \sum_s e^{-E(s)/kT},

where the index ''s'' runs through all microstates of the system. ''Z'' is sometimes called the Boltzmann sum over states (or "Zustandssumme" in the original German). If we index the summation via the energy eigenvalues instead of all possible states, degeneracy must be taken into account. The probability of our system having energy \varepsilon_i is simply the sum of the probabilities of all corresponding microstates:

:P(\varepsilon_i) = \frac{1}{Z}\,g_i e^{-\varepsilon_i/kT}

where, with the obvious modification,

:Z = \sum_j g_j e^{-\varepsilon_j/kT};

this is the same result as before.

Comments on this derivation:
*Notice that in this formulation, the initial assumption "... suppose the system has total ''N'' particles ..." is dispensed with. Indeed, the number of particles possessed by the system plays no role in arriving at the distribution. Rather, how many particles would occupy states with energy \varepsilon_i follows as an easy consequence.
*What has been presented above is essentially a derivation of the canonical partition function. As one can see by comparing the definitions, the Boltzmann sum over states is equal to the canonical partition function.
*Exactly the same approach can be used to derive Fermi–Dirac and Bose–Einstein statistics. However, there one would replace the canonical ensemble with the grand canonical ensemble, since there is exchange of particles between the system and the reservoir. Also, the system one considers in those cases is a single-particle ''state'', not a particle. (In the above discussion, we could have assumed our system to be a single atom.)
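The result P(s) = e^{-E(s)/kT}/Z is easy to verify on a toy system. This sketch uses three hypothetical states with kT set to 1:

```python
from math import exp

kT = 1.0
# Toy system: three states with made-up energies (units of kT).
E = {'s1': 0.0, 's2': 1.0, 's3': 2.0}

# Boltzmann sum over states ("Zustandssumme").
Z = sum(exp(-Es / kT) for Es in E.values())

# Canonical probabilities P(s) = exp(-E(s)/kT) / Z.
P = {s: exp(-Es / kT) / Z for s, Es in E.items()}

# Probabilities are normalized, and their ratios depend only on the
# energy difference: P(s1)/P(s2) = exp(-(E1 - E2)/kT).
assert abs(sum(P.values()) - 1.0) < 1e-12
assert abs(P['s1'] / P['s2'] - exp((E['s2'] - E['s1']) / kT)) < 1e-12
```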


See also

* Bose–Einstein statistics
* Fermi–Dirac statistics
* Boltzmann factor



Bibliography

*Carter, Ashley H., ''Classical and Statistical Thermodynamics'', Prentice–Hall, New Jersey, 2001.
*Pathria, Raj, ''Statistical Mechanics'', Butterworth–Heinemann, 1996.