Mean-field theory

In physics and probability theory, mean-field theory (MFT), also known as self-consistent field theory, studies the behavior of high-dimensional random (stochastic) models by studying a simpler model that approximates the original by averaging over degrees of freedom (the number of values in the final calculation of a statistic that are free to vary). Such models consider many individual components that interact with each other. The main idea of MFT is to replace all interactions to any one body with an average or effective interaction, sometimes called a ''molecular field''. This reduces any many-body problem into an effective one-body problem. The ease of solving MFT problems means that some insight into the behavior of the system can be obtained at a lower computational cost.

MFT has since been applied to a wide range of fields outside of physics, including statistical inference, graphical models, neuroscience, artificial intelligence, epidemic models, queueing theory, computer-network performance, and game theory, as in the quantal response equilibrium.


Origins

The idea first appeared in physics (statistical mechanics) in the work of Pierre Curie and Pierre Weiss to describe phase transitions. MFT has been used in the Bragg–Williams approximation, models on the Bethe lattice, Landau theory, the Pierre–Weiss approximation, Flory–Huggins solution theory, and Scheutjens–Fleer theory.

Systems with many (sometimes infinite) degrees of freedom are generally hard to solve exactly or compute in closed, analytic form, except for some simple cases (e.g. certain Gaussian random-field theories, the 1D Ising model). Often combinatorial problems arise that make things like computing the partition function of a system difficult. MFT is an approximation method that often makes the original problem solvable and open to calculation, and in some cases MFT may give very accurate approximations.

In field theory, the Hamiltonian may be expanded in terms of the magnitude of fluctuations around the mean of the field. In this context, MFT can be viewed as the "zeroth-order" expansion of the Hamiltonian in fluctuations. Physically, this means that an MFT system has no fluctuations, which coincides with the idea that one is replacing all interactions with a "mean field". Quite often, MFT provides a convenient launch point for studying higher-order fluctuations. For example, when computing the partition function, studying the combinatorics of the interaction terms in the Hamiltonian can sometimes at best produce perturbation results or Feynman diagrams that correct the mean-field approximation.


Validity

In general, dimensionality plays an active role in determining whether a mean-field approach will work for any particular problem. There is sometimes a critical dimension above which MFT is valid and below which it is not.

Heuristically, many interactions are replaced in MFT by one effective interaction. So if the field or particle exhibits many random interactions in the original system, these interactions tend to cancel each other out, so the mean effective interaction and MFT will be more accurate. This is true in cases of high dimensionality, when the Hamiltonian includes long-range forces, or when the particles are extended (e.g. polymers). The Ginzburg criterion is the formal expression of how fluctuations render MFT a poor approximation, often depending upon the number of spatial dimensions in the system of interest.


Formal approach (Hamiltonian)

The formal basis for mean-field theory is the Bogoliubov inequality. This inequality states that the free energy of a system with Hamiltonian

: \mathcal{H} = \mathcal{H}_0 + \Delta \mathcal{H}

has the following upper bound:

: F \leq F_0 \ \stackrel{\text{def}}{=}\ \langle \mathcal{H} \rangle_0 - T S_0,

where S_0 is the entropy, and F and F_0 are Helmholtz free energies. The average is taken over the equilibrium ensemble of the reference system with Hamiltonian \mathcal{H}_0. In the special case that the reference Hamiltonian is that of a non-interacting system and can thus be written as

: \mathcal{H}_0 = \sum_{i=1}^N h_i(\xi_i),

where \xi_i are the degrees of freedom of the individual components of our statistical system (atoms, spins and so forth), one can consider sharpening the upper bound by minimising the right-hand side of the inequality. The minimising reference system is then the "best" approximation to the true system using non-correlated degrees of freedom and is known as the mean-field approximation.

For the most common case that the target Hamiltonian contains only pairwise interactions, i.e.,

: \mathcal{H} = \sum_{(i,j) \in \mathcal{P}} V_{i,j}(\xi_i, \xi_j),

where \mathcal{P} is the set of pairs that interact, the minimising procedure can be carried out formally. Define \operatorname{Tr}_i f(\xi_i) as the generalized sum of the observable f over the degrees of freedom of the single component (sums for discrete variables, integrals for continuous ones). The approximating free energy is given by

: \begin{align} F_0 &= \operatorname{Tr}_{1,2,\ldots,N} \mathcal{H}(\xi_1, \xi_2, \ldots, \xi_N) P^{(N)}_0(\xi_1, \xi_2, \ldots, \xi_N) \\ &+ kT \,\operatorname{Tr}_{1,2,\ldots,N} P^{(N)}_0(\xi_1, \xi_2, \ldots, \xi_N) \log P^{(N)}_0(\xi_1, \xi_2, \ldots, \xi_N), \end{align}

where P^{(N)}_0(\xi_1, \xi_2, \ldots, \xi_N) is the probability of finding the reference system in the state specified by the variables (\xi_1, \xi_2, \ldots, \xi_N). This probability is given by the normalized Boltzmann factor

: \begin{align} P^{(N)}_0(\xi_1, \xi_2, \ldots, \xi_N) &= \frac{1}{Z^{(N)}_0} e^{-\beta \mathcal{H}_0(\xi_1, \xi_2, \ldots, \xi_N)} \\ &= \prod_{i=1}^N \frac{1}{Z_0} e^{-\beta h_i(\xi_i)} \ \stackrel{\text{def}}{=}\ \prod_{i=1}^N P^{(i)}_0(\xi_i), \end{align}

where Z_0 is the partition function. Thus

: \begin{align} F_0 &= \sum_{(i,j) \in \mathcal{P}} \operatorname{Tr}_{i,j} V_{i,j}(\xi_i, \xi_j) P^{(i)}_0(\xi_i) P^{(j)}_0(\xi_j) \\ &+ kT \sum_{i=1}^N \operatorname{Tr}_i P^{(i)}_0(\xi_i) \log P^{(i)}_0(\xi_i). \end{align}

In order to minimise, we take the derivative with respect to the single-degree-of-freedom probabilities P^{(i)}_0 using a Lagrange multiplier to ensure proper normalization. The end result is the set of self-consistency equations

: P^{(i)}_0(\xi_i) = \frac{1}{Z_0} e^{-\beta h_i^{\text{MF}}(\xi_i)}, \quad i = 1, 2, \ldots, N,

where the mean field is given by

: h_i^{\text{MF}}(\xi_i) = \sum_{j : (i,j) \in \mathcal{P}} \operatorname{Tr}_j V_{i,j}(\xi_i, \xi_j) P^{(j)}_0(\xi_j).
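The Bogoliubov inequality can be checked numerically on a toy system. The following sketch (an illustration, not from the source; the choice of a two-spin Ising pair, the values of J and beta, and the grid of trial fields m are all arbitrary) evaluates the exact free energy of H = -J s1 s2 and the variational bound obtained from the non-interacting reference H_0 = -m (s1 + s2), and confirms the bound holds for every trial field.

```python
import math

def exact_free_energy(J, beta):
    """F = -(1/beta) ln Z, with Z summed over the four spin configurations."""
    Z = sum(math.exp(beta * J * s1 * s2) for s1 in (-1, 1) for s2 in (-1, 1))
    return -math.log(Z) / beta

def bogoliubov_bound(J, beta, m):
    """F_0 = <H>_0 - T S_0, rewritten as F_ref + <H - H_0>_0 in the
    product (reference) ensemble of two independent spins in field m."""
    t = math.tanh(beta * m)                                # <s_i>_0 per spin
    F_ref = -2.0 * math.log(2.0 * math.cosh(beta * m)) / beta
    return F_ref + (-J * t * t + 2.0 * m * t)              # add <H - H_0>_0

J, beta = 1.0, 1.0                                         # illustrative values
F = exact_free_energy(J, beta)
bounds = [bogoliubov_bound(J, beta, 0.1 * k) for k in range(-20, 21)]
assert all(F <= F0 + 1e-12 for F0 in bounds)               # F <= F_0 everywhere
best = min(bounds)   # tightest bound over this grid: the mean-field estimate
```

Minimising over m tightens the bound exactly as the formal minimisation over reference probabilities does in the derivation above.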


Applications

Mean-field theory can be applied to a number of physical systems to study phenomena such as phase transitions.


Ising model


Formal derivation

The Bogoliubov inequality, shown above, can be used to find the dynamics of a mean-field model of the two-dimensional Ising lattice. A magnetisation function can be calculated from the resultant approximate free energy. The first step is choosing a more tractable approximation of the true Hamiltonian. Using a non-interacting or effective-field Hamiltonian

: \mathcal{H}_0 = -m \sum_i s_i,

the variational free energy is

: F_V = F_0 + \left\langle \left( -J \sum_{i,j} s_i s_j - h \sum_i s_i \right) - \left( -m \sum_i s_i \right) \right\rangle_0.

By the Bogoliubov inequality, simplifying this quantity and calculating the magnetisation function that minimises the variational free energy yields the best approximation to the actual magnetisation. The minimiser is

: m = J \sum_j \langle s_j \rangle_0 + h,

which is the ensemble average of spin. This simplifies to

: m = \tanh(zJ\beta m) + h.

Equating the effective field felt by all spins to a mean spin value relates the variational approach to the suppression of fluctuations. The physical interpretation of the magnetisation function is then a field of mean values for individual spins.


Non-interacting spins approximation

Consider the Ising model on a d-dimensional lattice. The Hamiltonian is given by

: H = -J \sum_{\langle i,j \rangle} s_i s_j - h \sum_i s_i,

where \sum_{\langle i,j \rangle} indicates summation over the pair of nearest neighbors \langle i,j \rangle, and s_i, s_j = \pm 1 are neighboring Ising spins.

Let us transform our spin variable by introducing the fluctuation from its mean value m_i \equiv \langle s_i \rangle. We may rewrite the Hamiltonian as

: H = -J \sum_{\langle i,j \rangle} (m_i + \delta s_i)(m_j + \delta s_j) - h \sum_i s_i,

where we define \delta s_i \equiv s_i - m_i; this is the ''fluctuation'' of the spin. If we expand the right side, we obtain one term that is entirely dependent on the mean values of the spins and independent of the spin configurations. This is the trivial term, which does not affect the statistical properties of the system. The next term is the one involving the product of the mean value of the spin and the fluctuation value. Finally, the last term involves a product of two fluctuation values.

The mean-field approximation consists of neglecting this second-order fluctuation term:

: H \approx H^{\text{MF}} \equiv -J \sum_{\langle i,j \rangle} (m_i m_j + m_i \,\delta s_j + m_j \,\delta s_i) - h \sum_i s_i.

These fluctuations are enhanced at low dimensions, making MFT a better approximation for high dimensions.

Again, the summand can be re-expanded. In addition, we expect that the mean value of each spin is site-independent, since the Ising chain is translationally invariant. This yields

: H^{\text{MF}} = -J \sum_{\langle i,j \rangle} \big(m^2 + 2m(s_i - m)\big) - h \sum_i s_i.

The summation over neighboring spins can be rewritten as \sum_{\langle i,j \rangle} = \frac{1}{2} \sum_i \sum_{j \in nn(i)}, where nn(i) means "nearest neighbor of i", and the 1/2 prefactor avoids double counting, since each bond participates in two spins. Simplifying leads to the final expression

: H^{\text{MF}} = \frac{J m^2 N z}{2} - \underbrace{(h + Jzm)}_{h^{\text{eff}}} \sum_i s_i,

where z is the coordination number.
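The expansion of the summand described above can be written out term by term (a restatement of the step in the text, not new material):

: (m_i + \delta s_i)(m_j + \delta s_j) = m_i m_j + m_i \,\delta s_j + m_j \,\delta s_i + \delta s_i \,\delta s_j,

where the first term is the trivial mean-value term, the middle two are the mean–fluctuation cross terms that MFT keeps, and the last, second-order term \delta s_i \,\delta s_j is the one the mean-field approximation neglects.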
At this point, the Ising Hamiltonian has been ''decoupled'' into a sum of one-body Hamiltonians with an ''effective mean field'' h^{\text{eff}} = h + Jzm, which is the sum of the external field h and of the ''mean field'' induced by the neighboring spins. It is worth noting that this mean field directly depends on the number of nearest neighbors and thus on the dimension of the system (for instance, for a hypercubic lattice of dimension d, z = 2d).

Substituting this Hamiltonian into the partition function and solving the effective 1D problem, we obtain

: Z = e^{-\beta J m^2 N z / 2} \left[ 2 \cosh\left(\frac{h + Jzm}{k_\text{B} T}\right) \right]^N,

where N is the number of lattice sites. This is a closed and exact expression for the partition function of the system. We may obtain the free energy of the system and calculate critical exponents. In particular, we can obtain the magnetization m as a function of h^{\text{eff}}.

We thus have two equations between m and h^{\text{eff}}, allowing us to determine m as a function of temperature. This leads to the following observation:

* For temperatures greater than a certain value T_\text{c}, the only solution is m = 0. The system is paramagnetic.
* For T < T_\text{c}, there are two non-zero solutions: m = \pm m_0. The system is ferromagnetic.

T_\text{c} is given by the relation T_\text{c} = \frac{Jz}{k_\text{B}}. This shows that MFT can account for the ferromagnetic phase transition.
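The two equations between m and the effective field combine into the self-consistency condition m = tanh((h + Jzm)/k_B T), which can be solved numerically. The sketch below (an illustration, not from the source; the values of J, z, the temperatures, and the starting guess are arbitrary choices, with units chosen so k_B = 1) finds the magnetization by fixed-point iteration on either side of the mean-field critical temperature T_c = zJ/k_B.

```python
import math

def solve_magnetization(T, J=1.0, z=4, h=0.0, m0=0.9, tol=1e-12, max_iter=10000):
    """Iterate m <- tanh((h + J*z*m)/T) to a fixed point (k_B = 1 units)."""
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh((h + J * z * m) / T)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

J, z = 1.0, 4        # e.g. a square lattice, where z = 2d = 4
Tc = z * J           # mean-field critical temperature in k_B = 1 units
m_low = solve_magnetization(T=0.5 * Tc, J=J, z=z)   # ferromagnetic phase
m_high = solve_magnetization(T=2.0 * Tc, J=J, z=z)  # paramagnetic phase
assert m_low > 0.9          # spontaneous magnetization below T_c
assert abs(m_high) < 1e-6   # only m = 0 survives above T_c
```

Below T_c the iteration settles on a non-zero branch (its sign set by the starting guess), while above T_c it collapses to m = 0, reproducing the two regimes listed above.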


Application to other systems

Similarly, MFT can be applied to other types of Hamiltonian, as in the following cases:

* To study the metal–superconductor transition. In this case, the analog of the magnetization is the superconducting gap \Delta.
* The molecular field of a liquid crystal that emerges when the Laplacian of the director field is non-zero.
* To determine the optimal amino acid side-chain packing given a fixed protein backbone in protein structure prediction (see Self-consistent mean field (biology)).
* To determine the elastic properties of a composite material.

Variational minimisation like mean-field theory can also be used in statistical inference.


Extension to time-dependent mean fields

In mean-field theory, the mean field appearing in the single-site problem is a time-independent scalar or vector quantity. However, this is not always the case: in a variant of mean-field theory called dynamical mean-field theory (DMFT), the mean field becomes a time-dependent quantity. For instance, DMFT can be applied to the Hubbard model to study the metal–Mott-insulator transition.


See also

* Dynamical mean-field theory
* Mean-field game theory
* Generalized epidemic mean-field model

