Varadhan's Lemma
In mathematics, Varadhan's lemma is a result from large deviations theory named after S. R. Srinivasa Varadhan. The result gives information on the asymptotic distribution of a statistic ''φ''(''Z''ε) of a family of random variables ''Z''ε as ''ε'' becomes small, in terms of a rate function for the variables.

Statement of the lemma

Let ''X'' be a regular topological space; let (''Z''ε)ε>0 be a family of random variables taking values in ''X''; let ''μ''ε be the law (probability measure) of ''Z''ε. Suppose that (''μ''ε)ε>0 satisfies the large deviation principle with good rate function ''I'' : ''X'' → [0, +∞]. Let ''ϕ'' : ''X'' → R be any continuous function. Suppose that at least one of the following two conditions holds true: either the tail condition
:\lim_{M \to \infty} \limsup_{\varepsilon \to 0} \big( \varepsilon \log \mathbf{E} \big[ \exp\big(\phi(Z_{\varepsilon}) / \varepsilon\big) \, \mathbf{1}\big(\phi(Z_{\varepsilon}) \geq M\big) \big] \big) = - \infty ...
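As a quick numerical illustration (added here, not part of the article): assume the hypothetical family Z_ε = √ε · N(0, 1), whose laws satisfy the large deviation principle with good rate function I(x) = x²/2, and take the bounded continuous function φ(x) = sin(x), for which the tail condition holds trivially. Varadhan's lemma then predicts ε log E[exp(φ(Z_ε)/ε)] → sup_x (φ(x) − I(x)). A minimal Python sketch of the check:

import numpy as np

# Minimal sketch: check  eps * log E[exp(phi(Z_eps)/eps)]  ->  sup_x (phi(x) - I(x))
# for the assumed family Z_eps = sqrt(eps) * N(0, 1)  (rate function I(x) = x^2 / 2)
# and the assumed test function phi(x) = sin(x).
x = np.linspace(-5.0, 5.0, 200001)   # integration grid
dx = x[1] - x[0]
phi = np.sin(x)                      # bounded, so the tail condition is immediate
I = x ** 2 / 2                       # good rate function of the Gaussian family

print("sup_x (phi - I) =", np.max(phi - I))

for eps in (0.5, 0.1, 0.01, 0.001):
    # log of the N(0, eps) density on the grid
    log_density = -x ** 2 / (2 * eps) - 0.5 * np.log(2 * np.pi * eps)
    # log E[exp(phi/eps)] via a log-sum-exp quadrature to avoid overflow
    log_integrand = phi / eps + log_density
    m = log_integrand.max()
    log_E = m + np.log(np.sum(np.exp(log_integrand - m)) * dx)
    print(f"eps = {eps:6.3f}   eps * log E = {eps * log_E:.4f}")

The printed values approach sup_x (sin(x) − x²/2) ≈ 0.40 as ε shrinks.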


Mathematics
Mathematics is an area of knowledge that includes the topics of numbers, formulas and related structures, shapes and the spaces in which they are contained, and quantities and their changes. These topics are represented in modern mathematics with the major subdisciplines of number theory, algebra, geometry, and analysis, respectively. There is no general consensus among mathematicians about a common definition for their academic discipline. Most mathematical activity involves the discovery of properties of abstract objects and the use of pure reason to prove them. These objects consist of either abstractions from nature or, in modern mathematics, entities that are stipulated to have certain properties, called axioms. A ''proof'' consists of a succession of applications of deductive rules to already established results. These results include previously proved theorems, axioms, and, in case of abstraction from nature, some basic properties that are considered true starting points of ...


Large Deviations Theory
In probability theory, the theory of large deviations concerns the asymptotic behaviour of remote tails of sequences of probability distributions. While some basic ideas of the theory can be traced to Laplace, the formalization started with insurance mathematics, namely ruin theory with Cramér and Lundberg. A unified formalization of large deviation theory was developed in 1966, in a paper by Varadhan. Large deviations theory formalizes the heuristic ideas of ''concentration of measures'' and widely generalizes the notion of convergence of probability measures. Roughly speaking, large deviations theory concerns itself with the exponential decline of the probability measures of certain kinds of extreme or ''tail'' events.

Introductory examples

An elementary example

Consider a sequence of independent tosses of a fair coin. The possible outcomes could be heads or tails. Let us denote the possible outcome of the i-th trial by X_i, where we encode head as 1 and tail as 0. Now let M_N ...
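Assuming M_N denotes the empirical mean of the first N tosses (the standard continuation of this example), Cramér's theorem gives P(M_N ≥ x) ≈ exp(−N·I(x)) for x > 1/2, with I(x) = log 2 + x log x + (1 − x) log(1 − x). A hedged Python sketch comparing the exact binomial tail with this rate, using the illustrative threshold x = 0.6 (chosen here, not taken from the article):

import numpy as np
from scipy.stats import binom

# Sketch: for a fair coin, large deviations theory predicts
#   P(M_N >= x) ~ exp(-N * I(x))   for x > 1/2,
# with Cramer rate function I(x) = log 2 + x*log(x) + (1 - x)*log(1 - x).
def rate(x):
    return np.log(2) + x * np.log(x) + (1 - x) * np.log(1 - x)

x = 0.6                                              # illustrative threshold (assumed)
print("I(0.6) =", rate(x))
for N in (100, 1000, 5000):
    tail = binom.sf(int(np.ceil(N * x)) - 1, N, 0.5) # exact P(M_N >= x)
    print(N, -np.log(tail) / N)

The quantity −(1/N) log P(M_N ≥ 0.6) drifts toward I(0.6) ≈ 0.020 as N grows.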


Asymptotic Analysis
In mathematical analysis, asymptotic analysis, also known as asymptotics, is a method of describing limiting behavior. As an illustration, suppose that we are interested in the properties of a function f(n) as n becomes very large. If f(n) = n^2 + 3n, then as n becomes very large, the term 3n becomes insignificant compared to n^2. The function f(n) is said to be "''asymptotically equivalent'' to n^2, as n → ∞". This is often written symbolically as f(n) ~ n^2, which is read as "f(n) is asymptotic to n^2". An example of an important asymptotic result is the prime number theorem. Let π(x) denote the prime-counting function (which is not directly related to the constant pi), i.e. π(x) is the number of prime numbers that are less than or equal to x. Then the theorem states that
:\pi(x) \sim \frac{x}{\ln x}.
Asymptotic analysis is commonly used in computer science as part of the analysis of algorithms and is often expressed there in terms of big O notation.

Definition

Formally, given functions f(x) and g(x), we define a binary relation
:f(x) \sim g(x) \quad ...
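To make the prime number theorem concrete, here is a small sketch (prime_pi is a helper sieve written for this illustration, not a library function) comparing π(x) with x / ln x:

import numpy as np

# Sketch: compare the prime-counting function pi(x) with the asymptotic estimate x / ln(x).
def prime_pi(n):
    """Count primes <= n with a simple Eratosthenes sieve (illustrative helper)."""
    sieve = np.ones(n + 1, dtype=bool)
    sieve[:2] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = False
    return int(sieve.sum())

for x in (10 ** 4, 10 ** 5, 10 ** 6):
    approx = x / np.log(x)
    print(x, prime_pi(x), round(approx), prime_pi(x) / approx)

The ratio in the last column creeps toward 1, in line with π(x) ~ x / ln x.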


Random Variable
A random variable (also called random quantity, aleatory variable, or stochastic variable) is a mathematical formalization of a quantity or object which depends on random events. It is a mapping or a function from possible outcomes (e.g., the possible upper sides of a flipped coin such as heads H and tails T) in a sample space (e.g., the set {H, T}) to a measurable space, often the real numbers (e.g., {−1, 1}, in which 1 corresponds to H and −1 corresponds to T). Informally, randomness typically represents some fundamental element of chance, such as in the roll of a die; it may also represent uncertainty, such as measurement error. However, the interpretation of probability is philosophically complicated, and even in specific cases is not always straightforward. The purely mathematical analysis of random variables is independent of such interpretational difficulties, and can be based upon a rigorous axiomatic setup. In the formal mathematical language of measure theory, a random var ...
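A tiny sketch of the coin example above (added for illustration): the random variable is simply a function from the sample space {H, T} to the real numbers, here H ↦ 1 and T ↦ −1.

import random

# Sketch: a random variable as a plain function on the sample space {"H", "T"},
# mapping H -> 1 and T -> -1.
sample_space = ["H", "T"]

def X(outcome):
    return 1 if outcome == "H" else -1

outcomes = [random.choice(sample_space) for _ in range(10_000)]   # fair coin flips
values = [X(w) for w in outcomes]
print(sum(values) / len(values))   # empirical mean, close to E[X] = 0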


Rate Function
In mathematics — specifically, in large deviations theory — a rate function is a function used to quantify the probabilities of rare events. It is required to have several properties which assist in the formulation of the large deviation principle. In some sense, the large deviation principle is an analogue of weak convergence of probability measures, but one which takes account of how well the rare events behave. A rate function is also called a Cramér function, after the Swedish probabilist Harald Cramér.

Definitions

Rate function

An extended real-valued function ''I'' : ''X'' → [0, +∞] defined on a Hausdorff topological space ''X'' is said to be a rate function if it is not identically +∞ and is lower semi-continuous, i.e. all the sub-level sets
:\{ x \in X \mid I(x) \leq c \} \mbox{ for } c \geq 0
are closed in ''X''. If, furthermore, they are compact, then ''I'' is said to be a good rate function. A family of probability measures (''μ''δ)δ > 0 ...
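A quick worked example (added here, not part of the article's text): on ''X'' = R, the function I(x) = x^2/2 is a good rate function, since for every c ≥ 0 the sub-level set
:\{ x \in \mathbf{R} \mid x^2/2 \leq c \} = [-\sqrt{2c}, \sqrt{2c}]
is closed and bounded, hence compact; it is the rate function governing the Gaussian family μ_δ = N(0, δ) as δ → 0. By contrast, the function with I(x) = 0 for x > 0 and I(x) = +∞ for x ≤ 0 is not a rate function, because each of its sub-level sets equals the non-closed set (0, ∞), i.e. it fails lower semi-continuity at 0.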




Regular Space
In topology and related fields of mathematics, a topological space ''X'' is called a regular space if every closed subset ''C'' of ''X'' and a point ''p'' not contained in ''C'' admit non-overlapping open neighborhoods. Thus ''p'' and ''C'' can be separated by neighborhoods. This condition is known as Axiom T3. The term "T3 space" usually means "a regular Hausdorff space". These conditions are examples of separation axioms.

Definitions

A topological space ''X'' is a regular space if, given any closed set ''F'' and any point ''x'' that does not belong to ''F'', there exists a neighbourhood ''U'' of ''x'' and a neighbourhood ''V'' of ''F'' that are disjoint. Concisely put, it must be possible to separate ''x'' and ''F'' with disjoint neighborhoods. A regular Hausdorff space, or T3 space, is a topological space that is both regular and a Hausdorff space. (A Hausdorff space or T2 space is a topological space in which any two distinct points are separated by neighbourhoods.) It turns out that a space is T3 if a ...
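A small worked example (added for illustration): every metric space (X, d) is regular. Given a closed set F and a point x not in F, the distance r = d(x, F) is strictly positive because F is closed, and the open sets
:U = B(x, r/2), \qquad V = \{ y \in X \mid d(y, F) < r/2 \}
are disjoint neighbourhoods of x and F: a point y lying in both would give d(x, F) ≤ d(x, y) + d(y, F) < r/2 + r/2 = r, contradicting r = d(x, F).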


Probability Measure
In mathematics, a probability measure is a real-valued function defined on a set of events in a probability space that satisfies measure properties such as ''countable additivity''. The difference between a probability measure and the more general notion of measure (which includes concepts like area or volume) is that a probability measure must assign value 1 to the entire probability space. Intuitively, the additivity property says that the probability assigned to the union of two disjoint events by the measure should be the sum of the probabilities of the events; for example, the value assigned to "1 or 2" in a throw of a die should be the sum of the values assigned to "1" and "2". Probability measures have applications in diverse fields, from physics to finance and biology.

Definition

The requirements for a function \mu to be a probability measure on a probability space are that:
* \mu must return results in the unit interval [0, 1], returning 0 for the empty set and 1 for t ...
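As a concrete sketch (added here), a probability measure on the finite sample space of a fair six-sided die can be coded as a point-mass table, with the measure of an event given by summing the masses of its outcomes:

from fractions import Fraction

# Sketch: a probability measure on the sample space {1, ..., 6} of a fair die.
masses = {k: Fraction(1, 6) for k in range(1, 7)}

def mu(event):
    """Measure of an event (a subset of the sample space): sum of point masses."""
    return sum(masses[k] for k in event)

print(mu({1, 2, 3, 4, 5, 6}))            # whole space -> 1
print(mu({1, 2}), mu({1}) + mu({2}))     # additivity on the disjoint events {1} and {2}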


Large Deviation Principle
In mathematics — specifically, in large deviations theory — a rate function is a function used to quantify the probabilities of rare events. It is required to have several properties which assist in the formulation of the large deviation principle. In some sense, the large deviation principle is an analogue of weak convergence of probability measures, but one which takes account of how well the rare events behave. A rate function is also called a Cramér function, after the Swedish probabilist Harald Cramér.

Definitions

Rate function

An extended real-valued function ''I'' : ''X'' → [0, +∞] defined on a Hausdorff topological space ''X'' is said to be a rate function if it is not identically +∞ and is lower semi-continuous, i.e. all the sub-level sets
:\{ x \in X \mid I(x) \leq c \} \mbox{ for } c \geq 0
are closed in ''X''. If, furthermore, they are compact, then ''I'' is said to be a good rate function. A family of probability measures (''μ''δ)δ > 0 ...


Continuous Function
In mathematics, a continuous function is a function such that a continuous variation (that is, a change without jump) of the argument induces a continuous variation of the value of the function. This means that there are no abrupt changes in value, known as ''discontinuities''. More precisely, a function is continuous if arbitrarily small changes in its value can be assured by restricting to sufficiently small changes of its argument. A discontinuous function is a function that is not continuous. Up until the 19th century, mathematicians largely relied on intuitive notions of continuity, and considered only continuous functions. The epsilon–delta definition of a limit was introduced to formalize the definition of continuity. Continuity is one of the core concepts of calculus and mathematical analysis, where arguments and values of functions are real and complex numbers. The concept has been generalized to functions between metric spaces and between topological spaces. The latter are the mo ...
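A small worked instance of the epsilon–delta definition mentioned above (added for illustration): the function f(x) = 2x + 1 is continuous at every point a, because given any ε > 0 one may take δ = ε/2; then |x − a| < δ implies
:|f(x) - f(a)| = |(2x + 1) - (2a + 1)| = 2|x - a| < 2\delta = \varepsilon .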


Indicator Function
In mathematics, an indicator function or a characteristic function of a subset of a set is a function that maps elements of the subset to one, and all other elements to zero. That is, if ''A'' is a subset of some set ''X'', one has \mathbf{1}_A(x) = 1 if x \in A, and \mathbf{1}_A(x) = 0 otherwise, where \mathbf{1}_A is a common notation for the indicator function. Other common notations are I_A, and \chi_A. The indicator function of ''A'' is the Iverson bracket of the property of belonging to ''A''; that is,
:\mathbf{1}_A(x) = [x \in A].
For example, the Dirichlet function is the indicator function of the rational numbers as a subset of the real numbers.

Definition

The indicator function of a subset ''A'' of a set ''X'' is a function \mathbf{1}_A \colon X \to \{0, 1\} defined as
:\mathbf{1}_A(x) := \begin{cases} 1 & \text{if } x \in A, \\ 0 & \text{if } x \notin A. \end{cases}
The Iverson bracket provides the equivalent notation [x \in A], to be used instead of \mathbf{1}_A(x). The function \mathbf{1}_A is sometimes denoted I_A, \chi_A, K_A, or even just ''A''. Nota ...
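A minimal coded sketch (added for illustration) of an indicator function, for the set A = {2, 4, 6} inside X = {1, ..., 6}:

# Sketch: the indicator function of A = {2, 4, 6} as a subset of X = {1, ..., 6}.
A = {2, 4, 6}

def indicator_A(x):
    return 1 if x in A else 0

print([indicator_A(x) for x in range(1, 7)])   # [0, 1, 0, 1, 0, 1]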


Moment (mathematics)
In mathematics, the moments of a function are certain quantitative measures related to the shape of the function's graph. If the function represents mass density, then the zeroth moment is the total mass, the first moment (normalized by total mass) is the center of mass, and the second moment is the moment of inertia. If the function is a probability distribution, then the first moment is the expected value, the second central moment is the variance, the third standardized moment is the skewness, and the fourth standardized moment is the kurtosis. The mathematical concept is closely related to the concept of moment in physics. For a distribution of mass or probability on a bounded interval, the collection of all the moments (of all orders, from 0 to ∞) uniquely determines the distribution (Hausdorff moment problem). The same is not true on unbounded intervals (Hamburger moment problem). In the mid-nineteenth century, Pafnuty Chebyshev became the first person to think systematic ...
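To tie the probabilistic moments together, here is a short sketch (added for illustration) computing the empirical mean, variance, skewness and kurtosis of a standard normal sample, for which the theoretical values are 0, 1, 0 and 0 (excess kurtosis):

import numpy as np
from scipy import stats

# Sketch: empirical versions of the moments named above for a standard normal sample.
rng = np.random.default_rng(0)
sample = rng.normal(loc=0.0, scale=1.0, size=100_000)

print("mean     ", sample.mean())           # first moment
print("variance ", sample.var())            # second central moment
print("skewness ", stats.skew(sample))      # third standardized moment
print("kurtosis ", stats.kurtosis(sample))  # fourth standardized moment (excess)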


Laplace Principle (large Deviations Theory)
In mathematics, Laplace's principle is a basic theorem in large deviations theory which is similar to Varadhan's lemma. It gives an asymptotic expression for the Lebesgue integral of exp(−''θφ''(''x'')) over a fixed set ''A'' as ''θ'' becomes large. Such expressions can be used, for example, in statistical mechanics to determine the limiting behaviour of a system as the temperature tends to absolute zero.

Statement of the result

Let ''A'' be a Lebesgue-measurable subset of ''d''-dimensional Euclidean space R''d'' and let ''φ'' : R''d'' → R be a measurable function with
:\int_A e^{-\varphi(x)} \, dx < \infty.
Then
:\lim_{\theta \to \infty} \frac{1}{\theta} \log \int_A e^{-\theta \varphi(x)} \, dx = - \mathop{\mathrm{ess\,inf}}_{x \in A} \varphi(x),
where ess inf denotes the essential infimum. Heuristically, this may be read as saying that for large ''θ'',
:\int_A e^{-\theta \varphi(x)} \, dx \approx \exp\big( -\theta \mathop{\mathrm{ess\,inf}}_{x \in A} \varphi(x) \big).
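A minimal numerical check (added here, with an assumed example): take A = [−1, 1] and φ(x) = (x − 2)², whose essential infimum over A is 1 (attained at x = 1), so (1/θ) log ∫_A e^(−θφ(x)) dx should tend to −1 as θ → ∞.

import numpy as np

# Sketch: check Laplace's principle for the assumed example A = [-1, 1], phi(x) = (x - 2)^2,
# where the essential infimum of phi over A is 1, so the limit should be -1.
x = np.linspace(-1.0, 1.0, 200001)
dx = x[1] - x[0]
phi = (x - 2.0) ** 2

for theta in (1, 10, 100, 1000):
    # log of the integral via log-sum-exp over the grid, to avoid underflow
    log_integrand = -theta * phi
    m = log_integrand.max()
    log_integral = m + np.log(np.sum(np.exp(log_integrand - m)) * dx)
    print(theta, log_integral / theta)

The printed values approach −1 as θ grows, matching − ess inf_A φ.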