In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in a wide variety of fields such as biology, neuroscience, computer science, information theory and sociology. Its main purpose is to clarify the properties of matter in aggregate, in terms of physical laws governing atomic motion.

Statistical mechanics arose out of the development of classical thermodynamics, a field for which it was successful in explaining macroscopic physical properties—such as temperature, pressure, and heat capacity—in terms of microscopic parameters that fluctuate about average values and are characterized by probability distributions.

While classical thermodynamics is primarily concerned with thermodynamic equilibrium, statistical mechanics has been applied in non-equilibrium statistical mechanics to the issues of microscopically modeling the speed of irreversible processes that are driven by imbalances. Examples of such processes include chemical reactions and flows of particles and heat. The fluctuation–dissipation theorem is the basic knowledge obtained from applying non-equilibrium statistical mechanics to study the simplest non-equilibrium situation of a steady-state current flow in a system of many particles.


History

In 1738, Swiss physicist and mathematician Daniel Bernoulli published ''Hydrodynamica'', which laid the basis for the kinetic theory of gases. In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as heat is simply the kinetic energy of their motion.

The founding of the field of statistical mechanics is generally credited to three physicists:
* Ludwig Boltzmann, who developed the fundamental interpretation of entropy in terms of a collection of microstates
* James Clerk Maxwell, who developed models of probability distribution of such states
* Josiah Willard Gibbs, who coined the name of the field in 1884

In 1859, after reading a paper on the diffusion of molecules by Rudolf Clausius, Scottish physicist James Clerk Maxwell formulated the Maxwell distribution of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range. This was the first-ever statistical law in physics. Maxwell also gave the first mechanical argument that molecular collisions entail an equalization of temperatures and hence a tendency towards equilibrium. Five years later, in 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and spent much of his life developing the subject further.

Statistical mechanics was initiated in the 1870s with the work of Boltzmann, much of which was collectively published in his 1896 ''Lectures on Gas Theory''. Boltzmann's original papers on the statistical interpretation of thermodynamics, the ''H''-theorem, transport theory, thermal equilibrium, the equation of state of gases, and similar subjects occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies. Boltzmann introduced the concept of an equilibrium statistical ensemble and also investigated, for the first time, non-equilibrium statistical mechanics with his ''H''-theorem.

The term "statistical mechanics" was coined by the American mathematical physicist J. Willard Gibbs in 1884. According to Gibbs, the term "statistical", in the context of mechanics, i.e. statistical mechanics, was first used by the Scottish physicist James Clerk Maxwell in 1871. "Probabilistic mechanics" might today seem a more appropriate term, but "statistical mechanics" is firmly entrenched. Shortly before his death, Gibbs published in 1902 ''Elementary Principles in Statistical Mechanics'', a book which formalized statistical mechanics as a fully general approach to address all mechanical systems—macroscopic or microscopic, gaseous or non-gaseous. Gibbs' methods were initially derived in the framework of classical mechanics; however, they were of such generality that they were found to adapt easily to the later quantum mechanics, and they still form the foundation of statistical mechanics to this day.


Principles: mechanics and ensembles

In physics, two types of mechanics are usually examined: classical mechanics and quantum mechanics. For both types of mechanics, the standard mathematical approach is to consider two concepts:
* The complete state of the mechanical system at a given time, mathematically encoded as a phase point (classical mechanics) or a pure quantum state vector (quantum mechanics).
* An equation of motion which carries the state forward in time: Hamilton's equations (classical mechanics) or the Schrödinger equation (quantum mechanics).

Using these two concepts, the state at any other time, past or future, can in principle be calculated. There is however a disconnect between these laws and everyday life experiences, as we do not find it necessary (nor even theoretically possible) to know exactly at a microscopic level the simultaneous positions and velocities of each molecule while carrying out processes at the human scale (for example, when performing a chemical reaction). Statistical mechanics fills this disconnection between the laws of mechanics and the practical experience of incomplete knowledge by adding some uncertainty about which state the system is in.

Whereas ordinary mechanics only considers the behaviour of a single state, statistical mechanics introduces the statistical ensemble, which is a large collection of virtual, independent copies of the system in various states. The statistical ensemble is a probability distribution over all possible states of the system. In classical statistical mechanics, the ensemble is a probability distribution over phase points (as opposed to a single phase point in ordinary mechanics), usually represented as a distribution in a phase space with canonical coordinate axes. In quantum statistical mechanics, the ensemble is a probability distribution over pure states and can be compactly summarized as a density matrix.

As is usual for probabilities, the ensemble can be interpreted in different ways:
* an ensemble can be taken to represent the various possible states that a ''single system'' could be in (epistemic probability, a form of knowledge), or
* the members of the ensemble can be understood as the states of the systems in experiments repeated on independent systems which have been prepared in a similar but imperfectly controlled manner (empirical probability), in the limit of an infinite number of trials.
These two meanings are equivalent for many purposes, and will be used interchangeably in this article.

However the probability is interpreted, each state in the ensemble evolves over time according to the equation of motion. Thus, the ensemble itself (the probability distribution over states) also evolves, as the virtual systems in the ensemble continually leave one state and enter another. The ensemble evolution is given by the Liouville equation (classical mechanics) or the von Neumann equation (quantum mechanics), written out below. These equations are simply derived by the application of the mechanical equation of motion separately to each virtual system contained in the ensemble, with the probability of the virtual system being conserved over time as it evolves from state to state.

One special class of ensemble is those ensembles that do not evolve over time. These ensembles are known as ''equilibrium ensembles'' and their condition is known as ''statistical equilibrium''. Statistical equilibrium occurs if, for each state in the ensemble, the ensemble also contains all of its future and past states with probabilities equal to the probability of being in that state. (By contrast, ''mechanical equilibrium'' is a state with a balance of forces that has ceased to evolve.) The study of equilibrium ensembles of isolated systems is the focus of statistical thermodynamics. Non-equilibrium statistical mechanics addresses the more general case of ensembles that change over time, and/or ensembles of non-isolated systems.
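For reference, the two ensemble evolution equations take the following standard forms, where ρ is the classical phase-space probability density, S the density operator, H the Hamiltonian, and {·,·} the Poisson bracket:

$$
\frac{\partial \rho}{\partial t} = -\{\rho, H\} \quad \text{(Liouville equation)}, \qquad
i\hbar\,\frac{\partial S}{\partial t} = [H, S] \quad \text{(von Neumann equation)} .
$$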


Statistical thermodynamics

The primary goal of statistical thermodynamics (also known as equilibrium statistical mechanics) is to derive the classical thermodynamics of materials in terms of the properties of their constituent particles and the interactions between them. In other words, statistical thermodynamics provides a connection between the macroscopic properties of materials in thermodynamic equilibrium, and the microscopic behaviours and motions occurring inside the material. Whereas statistical mechanics proper involves dynamics, here the attention is focused on ''statistical equilibrium'' (steady state). Statistical equilibrium does not mean that the particles have stopped moving (mechanical equilibrium); rather, only that the ensemble is not evolving.


Fundamental postulate

A sufficient (but not necessary) condition for statistical equilibrium with an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, etc.). There are many different equilibrium ensembles that can be considered, and only some of them correspond to thermodynamics. Additional postulates are necessary to motivate why the ensemble for a given system should have one form or another.

A common approach found in many textbooks is to take the ''equal a priori probability postulate''. This postulate states that

: ''For an isolated system with an exactly known energy and exactly known composition, the system can be found with equal probability in any microstate consistent with that knowledge.''

The equal a priori probability postulate therefore provides a motivation for the microcanonical ensemble described below. There are various arguments in favour of the equal a priori probability postulate:
* Ergodic hypothesis: An ergodic system is one that evolves over time to explore "all accessible" states: all those with the same energy and composition. In an ergodic system, the microcanonical ensemble is the only possible equilibrium ensemble with fixed energy. This approach has limited applicability, since most systems are not ergodic.
* Principle of indifference: In the absence of any further information, we can only assign equal probabilities to each compatible situation.
* Maximum information entropy: A more elaborate version of the principle of indifference states that the correct ensemble is the ensemble that is compatible with the known information and that has the largest Gibbs entropy (information entropy).

Other fundamental postulates for statistical mechanics have also been proposed. For example, recent studies show that the theory of statistical mechanics can be built without the equal a priori probability postulate; one such formalism is based on the fundamental thermodynamic relation together with an additional set of postulates.
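In symbols, the postulate assigns to each of the W microstates compatible with the known energy and composition the same probability, and the associated Boltzmann entropy follows (standard relations, with k_B the Boltzmann constant):

$$
p_i = \frac{1}{W}, \qquad S = k_\mathrm{B} \ln W .
$$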


Three thermodynamic ensembles

There are three equilibrium ensembles with a simple form that can be defined for any isolated system bounded inside a finite volume. These are the most often discussed ensembles in statistical thermodynamics. In the macroscopic limit (defined below) they all correspond to classical thermodynamics.

* Microcanonical ensemble: describes a system with a precisely given energy and fixed composition (precise number of particles). The microcanonical ensemble contains with equal probability each possible state that is consistent with that energy and composition.
* Canonical ensemble: describes a system of fixed composition that is in thermal equilibrium with a heat bath of a precise temperature. The canonical ensemble contains states of varying energy but identical composition; the different states in the ensemble are accorded different probabilities depending on their total energy.
* Grand canonical ensemble: describes a system with non-fixed composition (uncertain particle numbers) that is in thermal and chemical equilibrium with a thermodynamic reservoir. The reservoir has a precise temperature, and precise chemical potentials for various types of particle. The grand canonical ensemble contains states of varying energy and varying numbers of particles; the different states in the ensemble are accorded different probabilities depending on their total energy and total particle numbers.

For systems containing many particles (the thermodynamic limit), all three of the ensembles listed above tend to give identical behaviour. It is then simply a matter of mathematical convenience which ensemble is used. The Gibbs theorem about equivalence of ensembles was developed into the theory of the concentration of measure phenomenon, which has applications in many areas of science, from functional analysis to methods of artificial intelligence and big data technology.

Important cases where the thermodynamic ensembles ''do not'' give identical results include:
* Microscopic systems.
* Large systems at a phase transition.
* Large systems with long-range interactions.

In these cases the correct thermodynamic ensemble must be chosen, as there are observable differences between these ensembles not just in the size of fluctuations, but also in average quantities such as the distribution of particles. The correct ensemble is that which corresponds to the way the system has been prepared and characterized—in other words, the ensemble that reflects the knowledge about that system.
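A compact way to compare the three ensembles is through the probability they assign to a microstate i with energy E_i and particle number N_i (standard textbook forms, with β = 1/(k_B T), W the number of accessible microstates, μ the chemical potential, and Z and 𝒵 the canonical and grand canonical partition functions):

$$
P_i^{\mathrm{micro}} = \frac{1}{W}, \qquad
P_i^{\mathrm{canon}} = \frac{e^{-\beta E_i}}{Z}, \qquad
P_i^{\mathrm{grand}} = \frac{e^{-\beta (E_i - \mu N_i)}}{\mathcal{Z}} .
$$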


Calculation methods

Once the characteristic state function for an ensemble has been calculated for a given system, that system is 'solved' (macroscopic observables can be extracted from the characteristic state function). Calculating the characteristic state function of a thermodynamic ensemble is not necessarily a simple task, however, since it involves considering every possible state of the system. While some hypothetical systems have been exactly solved, the most general (and realistic) case is too complex for an exact solution. Various approaches exist to approximate the true ensemble and allow calculation of average quantities.
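For example, in the canonical ensemble the characteristic state function is the Helmholtz free energy, which follows from the partition function; macroscopic averages are then obtained from its derivatives (standard relations):

$$
Z = \sum_i e^{-\beta E_i}, \qquad
F = -k_\mathrm{B} T \ln Z, \qquad
\langle E \rangle = -\frac{\partial \ln Z}{\partial \beta} .
$$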


Exact

There are some cases which allow exact solutions.
* For very small microscopic systems, the ensembles can be directly computed by simply enumerating over all possible states of the system (using exact diagonalization in quantum mechanics, or an integral over all of phase space in classical mechanics); a brute-force sketch is given after this list.
* Some large systems consist of many separable microscopic systems, and each of the subsystems can be analysed independently. Notably, idealized gases of non-interacting particles have this property, allowing exact derivations of Maxwell–Boltzmann statistics, Fermi–Dirac statistics, and Bose–Einstein statistics.
* A few large systems with interaction have been solved. By the use of subtle mathematical techniques, exact solutions have been found for a few toy models. Some examples include the Bethe ansatz, the square-lattice Ising model in zero field, and the hard hexagon model.
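As a minimal sketch of the brute-force route mentioned in the first point, the following Python snippet enumerates all microstates of a small periodic 1D Ising chain and computes the canonical partition function and mean energy. The model, the coupling J, and units with k_B = 1 are illustrative assumptions, not taken from the article.

```python
# Hypothetical illustration: exact enumeration of the canonical ensemble
# for a tiny 1D Ising chain with periodic boundaries.
# There are 2**N microstates, so this only works for small N.
import itertools
import math

def ising_energy(spins, J=1.0):
    """Energy of a periodic 1D Ising configuration: E = -J * sum_i s_i s_{i+1}."""
    N = len(spins)
    return -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))

def canonical_averages(N=8, T=2.0, J=1.0, k_B=1.0):
    """Return (Z, <E>) by summing over all 2**N microstates."""
    beta = 1.0 / (k_B * T)
    Z = 0.0
    E_weighted = 0.0
    for spins in itertools.product([-1, +1], repeat=N):
        E = ising_energy(spins, J)
        w = math.exp(-beta * E)   # Boltzmann weight of this microstate
        Z += w
        E_weighted += E * w
    return Z, E_weighted / Z

if __name__ == "__main__":
    Z, E_mean = canonical_averages()
    print(f"Z = {Z:.4f}, <E> = {E_mean:.4f}")
```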


Monte Carlo

Although some problems in statistical physics can be solved analytically using approximations and expansions, most current research utilizes the large processing power of modern computers to simulate or approximate solutions. A common approach to statistical problems is to use a Monte Carlo simulation to yield insight into the properties of a complex system. Monte Carlo methods are important in computational physics, physical chemistry, and related fields, and have diverse applications including medical physics, where they are used to model radiation transport for radiation dosimetry calculations.

The Monte Carlo method examines just a few of the possible states of the system, with the states chosen randomly (with a fair weight). As long as these states form a representative sample of the whole set of states of the system, the approximate characteristic function is obtained. As more and more random samples are included, the errors are reduced to an arbitrarily low level.
* The Metropolis–Hastings algorithm is a classic Monte Carlo method which was initially used to sample the canonical ensemble; a sketch is given after this list.
* Path integral Monte Carlo, also used to sample the canonical ensemble.
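The following Python sketch applies Metropolis sampling to the same toy 1D Ising chain used in the exact-enumeration example above. The parameters (chain length, temperature, number of sweeps, burn-in fraction, units with k_B = 1) are illustrative assumptions; for small chains the estimate can be checked against the exact result.

```python
# Hypothetical illustration: Metropolis sampling of the canonical ensemble
# for a small periodic 1D Ising chain.
import math
import random

def metropolis_energy(N=8, T=2.0, J=1.0, k_B=1.0, n_sweeps=20000, seed=0):
    """Estimate <E> by single-spin-flip Metropolis sampling."""
    rng = random.Random(seed)
    beta = 1.0 / (k_B * T)
    spins = [rng.choice([-1, +1]) for _ in range(N)]
    E_sum, n_samples = 0.0, 0

    for sweep in range(n_sweeps):
        for _ in range(N):
            i = rng.randrange(N)
            # Energy change from flipping spin i (periodic neighbours).
            dE = 2.0 * J * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])
            # Accept with the Metropolis probability min(1, exp(-beta * dE)).
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                spins[i] = -spins[i]
        if sweep >= n_sweeps // 5:          # discard early sweeps as burn-in
            E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
            E_sum += E
            n_samples += 1

    return E_sum / n_samples

if __name__ == "__main__":
    print(f"Metropolis estimate of <E>: {metropolis_energy():.3f}")
```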


Other

* For rarefied non-ideal gases, approaches such as the cluster expansion use perturbation theory to include the effect of weak interactions, leading to a virial expansion (written out after this list).
* For dense fluids, another approximate approach is based on reduced distribution functions, in particular the radial distribution function.
* Molecular dynamics computer simulations can be used to calculate microcanonical ensemble averages, in ergodic systems. With the inclusion of a connection to a stochastic heat bath, they can also model canonical and grand canonical conditions.
* Mixed methods involving non-equilibrium statistical mechanical results (see below) may be useful.
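The virial expansion referred to in the first point expresses the pressure of a dilute gas as a power series in the number density ρ (a standard form, with B_n(T) the temperature-dependent virial coefficients):

$$
\frac{p}{k_\mathrm{B} T} = \rho + B_2(T)\,\rho^{2} + B_3(T)\,\rho^{3} + \cdots
$$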


Non-equilibrium statistical mechanics

Many physical phenomena involve quasi-thermodynamic processes out of equilibrium, for example:
* heat transport by the internal motions in a material, driven by a temperature imbalance,
* electric currents carried by the motion of charges in a conductor, driven by a voltage imbalance,
* spontaneous chemical reactions driven by a decrease in free energy,
* friction, dissipation, quantum decoherence,
* systems being pumped by external forces (optical pumping, etc.),
* and irreversible processes in general.

All of these processes occur over time with characteristic rates. These rates are important in engineering. The field of non-equilibrium statistical mechanics is concerned with understanding these non-equilibrium processes at the microscopic level. (Statistical thermodynamics can only be used to calculate the final result, after the external imbalances have been removed and the ensemble has settled back down to equilibrium.)

In principle, non-equilibrium statistical mechanics could be mathematically exact: ensembles for an isolated system evolve over time according to deterministic equations such as Liouville's equation or its quantum equivalent, the von Neumann equation. These equations are the result of applying the mechanical equations of motion independently to each state in the ensemble. These ensemble evolution equations inherit much of the complexity of the underlying mechanical motion, and so exact solutions are very difficult to obtain. Moreover, the ensemble evolution equations are fully reversible and do not destroy information (the ensemble's Gibbs entropy is preserved). In order to make headway in modelling irreversible processes, it is necessary to consider additional factors besides probability and reversible mechanics. Non-equilibrium mechanics is therefore an active area of theoretical research as the range of validity of these additional assumptions continues to be explored. A few approaches are described in the following subsections.
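The Gibbs entropy whose conservation is noted above is defined, for an ensemble with microstate probabilities p_i, by the standard expression:

$$
S_{\mathrm{Gibbs}} = -k_\mathrm{B} \sum_i p_i \ln p_i .
$$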


Stochastic methods

One approach to non-equilibrium statistical mechanics is to incorporate stochastic (random) behaviour into the system. Stochastic behaviour destroys information contained in the ensemble. While this is technically inaccurate (aside from hypothetical situations involving black holes, a system cannot in itself cause loss of information), the randomness is added to reflect that information of interest becomes converted over time into subtle correlations within the system, or to correlations between the system and environment. These correlations appear as chaotic or pseudorandom influences on the variables of interest. By replacing these correlations with randomness proper, the calculations can be made much easier.
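A widely used concrete example of this strategy is the Langevin equation (standard form, given here as an illustration), in which the detailed influence of the environment on a particle of mass m is replaced by a friction term with coefficient γ and a random force ξ(t) whose strength is tied to the friction by a fluctuation–dissipation relation:

$$
m\,\frac{dv}{dt} = -\gamma\, v + \xi(t), \qquad
\langle \xi(t)\,\xi(t') \rangle = 2\,\gamma\, k_\mathrm{B} T\,\delta(t - t') .
$$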


Near-equilibrium methods

Another important class of non-equilibrium statistical mechanical models deals with systems that are only very slightly perturbed from equilibrium. With very small perturbations, the response can be analysed in linear response theory. A remarkable result, as formalized by the fluctuation–dissipation theorem, is that the response of a system when near equilibrium is precisely related to the fluctuations that occur when the system is in total equilibrium. Essentially, a system that is slightly away from equilibrium—whether put there by external forces or by fluctuations—relaxes towards equilibrium in the same way, since the system cannot tell the difference or "know" how it came to be away from equilibrium.

This provides an indirect avenue for obtaining numbers such as ohmic conductivity and thermal conductivity by extracting results from equilibrium statistical mechanics. Since equilibrium statistical mechanics is mathematically well defined and (in some cases) more amenable for calculations, the fluctuation–dissipation connection can be a convenient shortcut for calculations in near-equilibrium statistical mechanics. A few of the theoretical tools used to make this connection include:
* Fluctuation–dissipation theorem
* Onsager reciprocal relations
* Green–Kubo relations
* Landauer–Büttiker formalism
* Mori–Zwanzig formalism
* GENERIC formalism
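As an illustration of how an equilibrium correlation function yields a transport coefficient, one standard form of the Green–Kubo relation for the ohmic (electrical) conductivity σ reads, with J_x the total electric current in the x-direction, V the system volume, and the average taken in equilibrium:

$$
\sigma = \frac{1}{V k_\mathrm{B} T} \int_0^{\infty} \langle J_x(0)\, J_x(t) \rangle_{\mathrm{eq}} \, dt .
$$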


Hybrid methods

An advanced approach uses a combination of stochastic methods and linear response theory. As an example, one approach to compute quantum coherence effects (weak localization, conductance fluctuations) in the conductance of an electronic system is the use of the Green–Kubo relations, with the inclusion of stochastic dephasing by interactions between various electrons by use of the Keldysh method.


Applications

The ensemble formalism can be used to analyze general mechanical systems with uncertainty in knowledge about the state of a system. Ensembles are also used in:
* propagation of uncertainty over time,
* regression analysis of gravitational orbits,
* ensemble forecasting of weather,
* dynamics of neural networks,
* bounded-rational potential games in game theory and non-equilibrium economics.

Statistical physics explains and quantitatively describes superconductivity, superfluidity, turbulence, collective phenomena in solids and plasma, and the structural features of liquids. It underlies modern astrophysics and the virial theorem. In solid state physics, statistical physics aids the study of liquid crystals, phase transitions, and critical phenomena. Many experimental studies of matter are entirely based on the statistical description of a system. These include the scattering of cold neutrons, X-rays, visible light, and more.

Statistical physics also plays a role in materials science, nuclear physics, astrophysics, chemistry, biology and medicine (e.g. the study of the spread of infectious diseases). Analytical and computational techniques derived from the statistical physics of disordered systems can be extended to large-scale problems, including machine learning, e.g., to analyze the weight space of deep neural networks. Statistical physics is thus finding applications in the area of medical diagnostics.


Quantum statistical mechanics

Quantum statistical mechanics is statistical mechanics applied to quantum mechanical systems. In quantum mechanics, a statistical ensemble (probability distribution over possible quantum states) is described by a density operator ''S'', which is a non-negative, self-adjoint, trace-class operator of trace 1 on the Hilbert space ''H'' describing the quantum system. This can be shown under various mathematical formalisms for quantum mechanics. One such formalism is provided by quantum logic.
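For a quantum system in thermal equilibrium at inverse temperature β = 1/(k_B T), the density operator takes the Gibbs (canonical) form, and the expectation value of an observable A follows from the trace formula (standard expressions, using the article's symbol ''S'' for the density operator and H for the Hamiltonian):

$$
S = \frac{e^{-\beta H}}{\operatorname{Tr}\, e^{-\beta H}}, \qquad
\langle A \rangle = \operatorname{Tr}(S A) .
$$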


Index of statistical mechanics topics


Physics

* Probability amplitude
* Statistical physics
* Boltzmann factor
* Feynman–Kac formula
* Fluctuation theorem
* Information entropy
* Vacuum expectation value
* Cosmic variance
* Negative probability
* Gibbs state
* Master equation
* Partition function (mathematics)
* Quantum probability


Percolation theory

* Percolation theory
* Schramm–Loewner evolution


See also

* List of textbooks in thermodynamics and statistical mechanics



External links


* Philosophy of Statistical Mechanics, article by Lawrence Sklar for the Stanford Encyclopedia of Philosophy.
* SklogWiki - Thermodynamics, statistical mechanics, and the computer simulation of materials. SklogWiki is particularly orientated towards liquids and soft condensed matter.
* Thermodynamics and Statistical Mechanics by Richard Fitzpatrick.
* A lecture series taught by Leonard Susskind.
* Vu-Quoc, L., Configuration integral (statistical mechanics), 2008. This wiki site is down; see this article in the web archive of 2012 April 28.