Correlation Inequality
A correlation inequality is any of a number of inequalities satisfied by the correlation functions of a model. Such inequalities are of particular use in statistical mechanics and in percolation theory. Examples include:
* Bell's inequality
* FKG inequality
* Griffiths inequality, and its generalisation, the Ginibre inequality
* Gaussian correlation inequality
A schematic form shared by several of these inequalities is sketched below.
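As a schematic sketch (an editorial illustration, not part of the article text above), several of the inequalities just listed assert that suitable observables f and g of the model are positively correlated, i.e. that the expectation of their product dominates the product of their expectations:
: \langle f g \rangle \geq \langle f \rangle \langle g \rangle ,
where \langle \cdot \rangle denotes expectation with respect to the model's probability measure. Which observables qualify is what distinguishes the individual inequalities: increasing events for the FKG inequality, spin monomials for the Griffiths inequality, and indicators of symmetric convex sets for the Gaussian correlation inequality.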
Correlation Function
A correlation function is a function that gives the statistical correlation between random variables, contingent on the spatial or temporal distance between those variables. If one considers the correlation function between random variables representing the same quantity measured at two different points, then this is often referred to as an autocorrelation function, which is made up of autocorrelations. Correlation functions of different random variables are sometimes called cross-correlation functions to emphasize that different variables are being considered and because they are made up of cross-correlations. Correlation functions are a useful indicator of dependencies as a function of distance in time or space, and they can be used to assess the distance required between sample points for the values to be effectively uncorrelated. In addition, they can form the basis of rules for interpolating values at points for which there are no observations.
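To make the idea concrete, here is a minimal Python sketch (an editorial illustration, not taken from the article above) that estimates an empirical autocorrelation function from a single time series; the function name autocorrelation and the AR(1) test series are chosen purely for illustration.

import numpy as np

def autocorrelation(x, max_lag):
    # Empirical autocorrelation C(tau) = corr(x_t, x_{t+tau}) of one
    # stationary series x, estimated for lags tau = 0 .. max_lag.
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                      # work with fluctuations about the mean
    n, var = len(x), x.var()
    return np.array([np.dot(x[:n - lag], x[lag:]) / ((n - lag) * var)
                     for lag in range(max_lag + 1)])

# Illustration: an AR(1) series x_t = 0.8 x_{t-1} + noise has true
# autocorrelation 0.8**tau, so the estimate decays with the lag,
# i.e. with "distance in time", as described above.
rng = np.random.default_rng(0)
eps = rng.standard_normal(10_000)
x = np.zeros_like(eps)
for t in range(1, len(eps)):
    x[t] = 0.8 * x[t - 1] + eps[t]
print(np.round(autocorrelation(x, 5), 3))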
Statistical Mechanics
In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles. Statistical mechanics arose out of the development of classical thermodynamics, a field for which it was successful in explaining macroscopic physical properties—such as temperature, pressure, and heat capacity—in terms of microscopic parameters that fluctuate about average values and are characterized by probability distributions. This established the fields of statistical thermodynamics and statistical physics. The founding of the field of statistical mechanics is generally credited to three physicists:
* Ludwig Boltzmann, who developed the fundamental interpretation of entropy in terms of a collection of microstates
* James Clerk Maxwell, who developed models of probability distributions ...
Percolation Theory
In statistical physics and mathematics, percolation theory describes the behavior of a network when nodes or links are added. This is a geometric type of phase transition, since at a critical fraction of addition the small, disconnected clusters of the network merge into significantly larger, connected, so-called spanning clusters. The applications of percolation theory to materials science and to many other disciplines are discussed here and in the articles network theory and percolation.
Introduction
A representative question (and the source of the name) is as follows. Assume that some liquid is poured on top of some porous material. Will the liquid be able to make its way from hole to hole and reach the bottom? This physical question is modelled mathematically as a three-dimensional network of vertices, usually called "sites", in which the edges or "bonds" between each two neighbors may be open (allowing the liquid through) with probability p, or closed with probability 1 − p, and they are assumed to be independent of one another.
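To illustrate the kind of question just described, the following Python sketch (an editorial illustration, not from the article) simulates site percolation on a two-dimensional grid rather than the three-dimensional bond model described above, simply to keep the code short: each site is open with probability p, and we check whether an open path connects the top row to the bottom row.

import numpy as np
from collections import deque

def percolates(open_sites):
    # Breadth-first search from every open site in the top row; report
    # whether some open site in the bottom row is reachable through
    # nearest-neighbour open sites ("the liquid reaches the bottom").
    rows, cols = open_sites.shape
    seen = np.zeros_like(open_sites, dtype=bool)
    queue = deque((0, c) for c in range(cols) if open_sites[0, c])
    for r, c in queue:
        seen[r, c] = True
    while queue:
        r, c = queue.popleft()
        if r == rows - 1:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and open_sites[nr, nc] and not seen[nr, nc]:
                seen[nr, nc] = True
                queue.append((nr, nc))
    return False

# Estimate the spanning probability for a few values of p on a 50 x 50 grid;
# it jumps from near 0 to near 1 across a critical value, as described above.
rng = np.random.default_rng(1)
for p in (0.4, 0.55, 0.7):
    trials = 200
    hits = sum(percolates(rng.random((50, 50)) < p) for _ in range(trials))
    print(f"p = {p:.2f}: spanning cluster in {hits / trials:.0%} of {trials} trials")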
FKG Inequality
In mathematics, the Fortuin–Kasteleyn–Ginibre (FKG) inequality is a correlation inequality, a fundamental tool in statistical mechanics and probabilistic combinatorics (especially random graphs and the probabilistic method), due to Fortuin, Kasteleyn and Ginibre. Informally, it says that in many random systems, increasing events are positively correlated, while an increasing and a decreasing event are negatively correlated. It was obtained by studying the random cluster model. An earlier version, for the special case of i.i.d. variables, called the Harris inequality, is due to Harris; see below. One generalization of the FKG inequality is the Holley inequality (1974) below, and an even further generalization is the Ahlswede–Daykin "four functions" theorem (1978). Furthermore, it has the same conclusion as the Griffiths inequalities, but the hypotheses are different.
The inequality
Let X be a finite distributive lattice, and μ a nonnegative function on it that is assumed to satisfy the (FKG) lattice condition, stated below.
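For reference, a minimal sketch of the standard formulation that the truncated sentence above leads into (stated from the standard literature rather than reproduced verbatim from the source): the nonnegative function μ is said to satisfy the FKG lattice condition if
: \mu(x \wedge y)\,\mu(x \vee y) \geq \mu(x)\,\mu(y) \quad \text{for all } x, y \in X,
where ∧ and ∨ denote the lattice meet and join. Under this condition, the FKG inequality asserts that any two nondecreasing functions f, g on X are positively correlated:
: \Big(\sum_{x \in X} f(x) g(x) \mu(x)\Big) \Big(\sum_{x \in X} \mu(x)\Big) \geq \Big(\sum_{x \in X} f(x) \mu(x)\Big) \Big(\sum_{x \in X} g(x) \mu(x)\Big).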
Griffiths Inequality
In statistical mechanics, the Griffiths inequality, sometimes also called the Griffiths–Kelly–Sherman inequality or GKS inequality, named after Robert B. Griffiths, is a correlation inequality for ferromagnetic spin systems. Informally, it says that in ferromagnetic spin systems, if the a priori distribution of the spin is invariant under spin flipping, the correlation of any monomial of the spins is non-negative, and the two-point correlation of any two monomials of the spins is non-negative. The inequality was proved by Griffiths for Ising ferromagnets with two-body interactions, then generalised by Kelly and Sherman to interactions involving an arbitrary number of spins, and then by Griffiths to systems with arbitrary spins. A more general formulation was given by Ginibre, and is now called the Ginibre inequality.
Definitions
Let \sigma = \{\sigma_j\}_{j \in \Lambda} be a configuration of (continuous or discrete) spins on a lattice Λ. If A ⊂ Λ is a list of lattice sites ...
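For reference, a minimal sketch of the standard statement (from the standard literature; the excerpt above is truncated before it): writing \sigma_A = \prod_{j \in A} \sigma_j for the monomial associated with a list A of sites and \langle \cdot \rangle for the equilibrium expectation of a ferromagnetic system with non-negative coupling constants, the first and second Griffiths inequalities read
: \langle \sigma_A \rangle \geq 0,
: \langle \sigma_A \sigma_B \rangle - \langle \sigma_A \rangle \langle \sigma_B \rangle \geq 0,
which is exactly the informal statement above: every monomial of the spins has non-negative expectation, and any two such monomials are non-negatively correlated.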
Ginibre Inequality
In statistical mechanics, the Ginibre inequality is a more general formulation of the Griffiths inequality for ferromagnetic spin systems, due to Ginibre; see the Griffiths Inequality entry above.
Gaussian Correlation Inequality
The Gaussian correlation inequality (GCI), formerly known as the Gaussian correlation conjecture (GCC), is a mathematical theorem in the fields of mathematical statistics and convex geometry.
The statement
The Gaussian correlation inequality states: let \mu be an n-dimensional Gaussian probability measure on \mathbb{R}^n, i.e. \mu is a multivariate normal distribution, centered at the origin. Then for all convex sets E, F \subset \mathbb{R}^n that are symmetric about the origin,
: \mu(E \cap F) \geq \mu(E) \cdot \mu(F).
As a simple example for n = 2, one can think of darts being thrown at a board, with their landing spots in the plane distributed according to a 2-variable normal distribution centered at the origin. (This is a reasonable assumption for any given darts player, with different players being described by different normal distributions.) If we now consider a circle and a rectangle in the plane, both centered at the origin, then the proportion of the darts landing in the intersection of the circle and the rectangle is at least as large as the product of the proportions landing in the circle and in the rectangle separately.
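The darts example can be checked numerically; the following Python sketch (an editorial illustration, with the covariance matrix, the disc radius and the rectangle dimensions chosen arbitrarily) estimates by Monte Carlo the three probabilities appearing in the inequality for a correlated 2-variable normal distribution, a centered disc E and a centered rectangle F.

import numpy as np

rng = np.random.default_rng(2)

# Centered 2-variable normal with some correlation (arbitrary choice).
cov = np.array([[1.0, 0.6],
                [0.6, 1.0]])
pts = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=1_000_000)

# E: disc of radius 1 about the origin; F: rectangle [-1.5, 1.5] x [-0.5, 0.5].
in_E = (pts ** 2).sum(axis=1) <= 1.0 ** 2
in_F = (np.abs(pts[:, 0]) <= 1.5) & (np.abs(pts[:, 1]) <= 0.5)

p_E, p_F = in_E.mean(), in_F.mean()
p_EF = (in_E & in_F).mean()
print(f"mu(E) = {p_E:.4f}, mu(F) = {p_F:.4f}")
print(f"mu(E and F) = {p_EF:.4f} >= mu(E) * mu(F) = {p_E * p_F:.4f}")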
Probabilistic Inequalities
Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty. (See Alan Stuart and Keith Ord, Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory, 6th ed., 2009; and William Feller, An Introduction to Probability Theory and Its Applications, Vol. 1, 3rd ed., Wiley, 1968.) The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%).