Azuma's Inequality
In probability theory, the Azuma–Hoeffding inequality (named after Kazuoki Azuma and Wassily Hoeffding) gives a concentration result for the values of martingales that have bounded differences. Suppose \{X_k : k = 0, 1, 2, \ldots\} is a martingale (or super-martingale) and :|X_k - X_{k-1}| \leq c_k almost surely. Then for all positive integers ''N'' and all positive reals ''\epsilon'', :\text{P}(X_N - X_0 \geq \epsilon) \leq \exp\left(\frac{-\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). And symmetrically (when ''X_k'' is a sub-martingale): :\text{P}(X_N - X_0 \leq -\epsilon) \leq \exp\left(\frac{-\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). If ''X'' is a martingale, using both inequalities above and applying the union bound allows one to obtain a two-sided bound: :\text{P}(|X_N - X_0| \geq \epsilon) \leq 2\exp\left(\frac{-\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). Proof The proof shares a similar idea with the pr ...
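As an illustrative sketch (not from the article, all names below are ours), the two-sided bound can be checked numerically for the simplest bounded-difference martingale, a simple ±1 random walk, for which c_k = 1 at every step:

```python
import math
import random

def azuma_bound(eps, c):
    """Two-sided Azuma bound: 2*exp(-eps^2 / (2 * sum of c_k^2))."""
    return 2 * math.exp(-eps**2 / (2 * sum(ck**2 for ck in c)))

def empirical_tail(n_steps, eps, trials, rng):
    """Fraction of simple +/-1 random walks with |X_N - X_0| >= eps."""
    hits = 0
    for _ in range(trials):
        x = sum(rng.choice((-1, 1)) for _ in range(n_steps))
        if abs(x) >= eps:
            hits += 1
    return hits / trials

rng = random.Random(0)
N, eps = 100, 30
bound = azuma_bound(eps, [1] * N)   # c_k = 1 for every step
freq = empirical_tail(N, eps, 5000, rng)
print(f"empirical P(|X_N - X_0| >= {eps}) = {freq:.4f}, Azuma bound = {bound:.4f}")
```

The empirical frequency always stays below the bound, which for these parameters equals 2·exp(−4.5) ≈ 0.022.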



Probability Theory
Probability theory is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability ...


Doob Decomposition Theorem
In the theory of stochastic processes in discrete time, a part of the mathematical theory of probability, the Doob decomposition theorem gives a unique decomposition of every adapted and integrable stochastic process as the sum of a martingale and a predictable process (or "drift") starting at zero. The theorem was proved by and is named for Joseph L. Doob. The analogous theorem in the continuous-time case is the Doob–Meyer decomposition theorem. Statement Let (\Omega, \mathcal{F}, \mathbb{P}) be a probability space, I = \{0, 1, \ldots, N\} with N \in \N or I = \N_0 a finite or an infinite index set, (\mathcal{F}_n)_{n \in I} a filtration of \mathcal{F}, and X = (X_n)_{n \in I} an adapted stochastic process with \mathbb{E}[|X_n|] < \infty for all n \in I. Then there exist a martingale M = (M_n)_{n \in I} and an integrable predictable process A = (A_n)_{n \in I} starting with A_0 = 0 such that X_n = M_n + A_n for every n \in I. Here predictable means that A_n is \mathcal{F}_{n-1}-measurable for every n \in I \setminus \{0\}. This decomposition is almost surely unique. Remark The theorem is valid word by word also for stochastic processes taking values in the ''d''-dim ...
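A minimal numerical sketch (our own, with illustrative names): for the partial-sum process of independent Bernoulli(p) variables, the predictable drift is A_n = pn and M_n = X_n − pn is the martingale part, so the decomposition can be written down in closed form and checked exactly:

```python
import random

p = 0.7          # success probability of each flip (illustrative choice)
n_steps = 50
rng = random.Random(1)

def decompose(path):
    """Doob decomposition of the partial-sum process X_n = flips[0]+...+flips[n-1]:
    A_n = p*n is the predictable drift, M_n = X_n - A_n is the martingale part."""
    x = [0]
    for flip in path:
        x.append(x[-1] + flip)
    a = [p * n for n in range(len(x))]
    m = [xn - an for xn, an in zip(x, a)]
    return x, m, a

path = [1 if rng.random() < p else 0 for _ in range(n_steps)]
x, m, a = decompose(path)

# The decomposition X_n = M_n + A_n holds exactly, with A_0 = 0.
assert all(abs(xn - (mn + an)) < 1e-12 for xn, mn, an in zip(x, m, a))

# Each martingale increment M_n - M_{n-1} = flip - p has conditional mean zero;
# over one path the average increment is close to zero:
mean_inc = sum(m[i] - m[i - 1] for i in range(1, len(m))) / n_steps
print(f"average martingale increment over one path: {mean_inc:+.3f}")
```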


Tôhoku Mathematical Journal
The ''Tohoku Mathematical Journal'' is a mathematical research journal published by Tohoku University in Japan. It was founded in August 1911 by Tsuruichi Hayashi. History Due to World War II, publication of the journal stopped in 1943 with volume 49. Publication was resumed in 1949 with the volume numbering starting again at 1. In order to distinguish between the identically numbered volumes, volumes in the first publishing period are referred to as the ''first series'' whereas the later volumes are called the ''second series''. Before volume 51 of the second series the journal was called the ''Tôhoku Mathematical Journal'', with a circumflex over the second letter of ''Tohoku''. Selected papers *. The first publication of the Sprague–Grundy theorem, the basis for much of combinatorial game theory, later independently rediscovered by P. M. Grundy. *. This paper describes Weiszfeld's algorithm for finding the geometric median. *. This paper, often referred to as "The Tohoku pape ...


Concentration Inequality
In probability theory, concentration inequalities provide bounds on how a random variable deviates from some value (typically, its expected value). The law of large numbers of classical probability theory states that sums of independent random variables are, under very mild conditions, close to their expectation with a large probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables. Concentration inequalities can be sorted according to how much information about the random variable is needed in order to use them. Markov's inequality Let X be a random variable that is non-negative (almost surely). Then, for every constant a > 0, : \Pr(X \geq a) \leq \frac{\mathbb{E}[X]}{a}. Note the following extension to Markov's inequality: if \Phi is a strictly increasing and non-negative function, then :\Pr(X \geq a) = \Pr(\Phi(X) \geq \Phi(a)) \leq \frac{\mathbb{E}[\Phi(X)]}{\Phi(a)}. Cheb ...
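A quick numerical sketch (ours, with illustrative names): for an Exponential(1) variable, E[X] = 1, so Markov's inequality gives P(X ≥ a) ≤ 1/a; the simulation below compares the bound with the empirical frequency:

```python
import random

def markov_bound(mean, a):
    """Markov's inequality: P(X >= a) <= E[X]/a for nonnegative X and a > 0."""
    return mean / a

rng = random.Random(2)
samples = [rng.expovariate(1.0) for _ in range(20000)]   # E[X] = 1
a = 3.0
freq = sum(s >= a for s in samples) / len(samples)
bound = markov_bound(1.0, a)
print(f"empirical P(X >= {a}) = {freq:.4f}, Markov bound = {bound:.4f}")
```

The true tail is e^{-3} ≈ 0.05, well below the Markov bound of 1/3, which illustrates how loose the first-moment bound typically is.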




Sergei Bernstein
Sergei Natanovich Bernstein (russian: Серге́й Ната́нович Бернште́йн; 5 March 1880 – 26 October 1968) was a Ukrainian and Russian mathematician of Jewish origin known for contributions to partial differential equations, differential geometry, probability theory, and approximation theory. Work Partial differential equations In his doctoral dissertation, submitted in 1904 to the Sorbonne, Bernstein solved Hilbert's nineteenth problem on the analytic solution of elliptic differential equations. His later work was devoted to Dirichlet's boundary problem for non-linear equations of elliptic type, where, in particular, he introduced a priori estimates. Probability theory In 1917, Bernstein suggested the first axiomatic foundation of probability theory, based on the underlying algebraic structure. It was later superseded by the measure-theoretic approach of Kolmogorov. In the 1920s, he introduced a method for proving limit theorems ...


Bernstein Inequalities (probability Theory)
In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let ''X''1, ..., ''X''''n'' be independent Bernoulli random variables taking values +1 and −1 with probability 1/2 (this distribution is also known as the Rademacher distribution), then for every positive \varepsilon, :\mathbb{P}\left(\left|\frac{1}{n}\sum_{i=1}^n X_i\right| > \varepsilon \right) \leq 2\exp\left(-\frac{n\varepsilon^2}{2(1+\varepsilon/3)} \right). Bernstein inequalities were proved and published by Sergei Bernstein in the 1920s and 1930s.J.V. Uspensky, "Introduction to Mathematical Probability", McGraw-Hill Book Company, 1937 Later, these inequalities were rediscovered several times in various forms. Thus, special cases of the Bernstein inequalities are also known as the Chernoff bound, Hoeffding's inequality and Azuma's inequality. Some of the inequalities 1. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that |X_i| ...
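The Rademacher case above can be checked empirically; this sketch is ours (the parameter choices are arbitrary), comparing the simulated tail frequency of the sample average against the stated bound:

```python
import math
import random

def bernstein_bound(n, eps):
    """Bernstein tail bound 2*exp(-n*eps^2 / (2*(1 + eps/3)))
    for averages of n Rademacher (+/-1, prob 1/2) variables."""
    return 2 * math.exp(-n * eps**2 / (2 * (1 + eps / 3)))

rng = random.Random(3)
n, eps, trials = 200, 0.3, 4000
hits = 0
for _ in range(trials):
    avg = sum(rng.choice((-1, 1)) for _ in range(n)) / n
    if abs(avg) > eps:
        hits += 1
freq = hits / trials
bound = bernstein_bound(n, eps)
print(f"empirical P(|avg| > {eps}) = {freq:.5f}, Bernstein bound = {bound:.5f}")
```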


Exponential Decay
A quantity is subject to exponential decay if it decreases at a rate proportional to its current value. Symbolically, this process can be expressed by the following differential equation, where ''N'' is the quantity and ''λ'' (lambda) is a positive rate called the exponential decay constant, disintegration constant, rate constant, or transformation constant: :\frac{dN}{dt} = -\lambda N. The solution to this equation (see derivation below) is: :N(t) = N_0 e^{-\lambda t}, where ''N''(''t'') is the quantity at time ''t'', and ''N''_0 = ''N''(0) is the initial quantity, that is, the quantity at time ''t'' = 0. Measuring rates of decay Mean lifetime If the decaying quantity, ''N''(''t''), is the number of discrete elements in a certain set, it is possible to compute the average length of time that an element remains in the set. This is called the mean lifetime (or simply the lifetime), where the exponential time constant, \tau, relates to the decay rate constant, λ, in the following way: :\tau = \frac{1}{\lambda}. The mean lifetime can be looked at as a ...
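A short numerical sketch (ours; N0 and λ are arbitrary choices): the analytic solution can be compared against a forward-Euler integration of dN/dt = −λN, and after one mean lifetime τ = 1/λ the quantity has fallen to N0/e:

```python
import math

def decay(n0, lam, t):
    """Analytic solution N(t) = N0 * exp(-lambda * t)."""
    return n0 * math.exp(-lam * t)

n0, lam = 1000.0, 0.25
tau = 1.0 / lam                 # mean lifetime

# After one mean lifetime the quantity has dropped to N0/e:
n_tau = decay(n0, lam, tau)
print(f"N(tau) = {n_tau:.2f}, N0/e = {n0 / math.e:.2f}")

# Forward-Euler integration of dN/dt = -lambda*N approaches the same value:
dt, n = 0.001, n0
for _ in range(int(tau / dt)):
    n += -lam * n * dt
print(f"Euler estimate at t = tau: {n:.2f}")
```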



Randomized Algorithm
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of randomness determined by the random bits; thus either the running time, or the output (or both) are random variables. One has to distinguish between algorithms that use the random input so that they always terminate with the correct answer, but where the expected running time is finite (Las Vegas algorithms, for example Quicksort), and algorithms which have a chance of producing an incorrect result (Monte Carlo algorithms, for example the Monte Carlo algorithm for the MFAS problem) or of failing to produce a result, either by signaling a failure or by failing to terminate. In some cases, probabilistic algorithms are the only practical means of solving a problem. In common practice, randomized algor ...
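The Las Vegas behavior mentioned above can be sketched with a randomized quicksort (a minimal version of ours, not from the article): the output is always correct, and only the random pivot choices, hence the running time, vary between runs:

```python
import random

def randomized_quicksort(items, rng=None):
    """Las Vegas algorithm: the result is always correctly sorted;
    only the random pivot choices (and hence the running time) vary."""
    rng = rng or random.Random()
    if len(items) <= 1:
        return list(items)
    pivot = rng.choice(items)
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less, rng) + equal + randomized_quicksort(greater, rng)

print(randomized_quicksort([5, 3, 8, 1, 9, 2, 7]))  # always [1, 2, 3, 5, 7, 8, 9]
```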


Doob Martingale
In the mathematical theory of probability, a Doob martingale (named after Joseph L. Doob, also known as a Lévy martingale) is a stochastic process that approximates a given random variable and has the martingale property with respect to the given filtration. It may be thought of as the evolving sequence of best approximations to the random variable based on information accumulated up to a certain time. When analyzing sums, random walks, or other additive functions of independent random variables, one can often apply the central limit theorem, law of large numbers, Chernoff's inequality, Chebyshev's inequality or similar tools. When analyzing similar objects where the differences are not independent, the main tools are martingales and Azuma's inequality. Definition Let Y be any random variable with \mathbb{E}[|Y|] < \infty. Suppose ...
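As a hedged sketch (our own construction): for the function f = sum of n independent Bernoulli(p) flips, the Doob martingale Z_i = E[f | first i flips] has the closed form (sum of revealed flips) + (n − i)p; each increment is Z_{i+1} − Z_i = X_{i+1} − p, so the differences are bounded and Azuma's inequality applies:

```python
import random

def doob_martingale(flips, p):
    """Z_i = E[sum of all flips | first i flips] for i = 0..n.
    Revealed flips contribute their value; unrevealed ones their mean p."""
    n = len(flips)
    z, partial = [], 0.0
    for i in range(n + 1):
        z.append(partial + (n - i) * p)
        if i < n:
            partial += flips[i]
    return z

p, n = 0.5, 20
rng = random.Random(4)
flips = [1 if rng.random() < p else 0 for _ in range(n)]
z = doob_martingale(flips, p)

# Z_0 is the unconditional mean, Z_n the fully revealed value:
assert z[0] == n * p and z[-1] == sum(flips)
# Each increment is X_{i+1} - p, so differences are bounded by max(p, 1-p):
assert all(abs(z[i + 1] - z[i]) <= max(p, 1 - p) + 1e-12 for i in range(n))
print(z[:5])
```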


Hoeffding's Lemma
In probability theory, Hoeffding's lemma is an inequality that bounds the moment-generating function of any bounded random variable. It is named after the Finnish–American mathematical statistician Wassily Hoeffding. The proof of Hoeffding's lemma uses Taylor's theorem and Jensen's inequality. Hoeffding's lemma is itself used in the proof of McDiarmid's inequality. Statement of the lemma Let ''X'' be any real-valued random variable such that a \leq X \leq b almost surely, i.e. with probability one. Then, for all \lambda \in \mathbb{R}, :\mathbb{E}\left[e^{\lambda X}\right] \leq \exp\Big(\lambda\mathbb{E}[X] + \frac{\lambda^2(b-a)^2}{8}\Big), or equivalently, :\mathbb{E}\left[e^{\lambda(X - \mathbb{E}[X])}\right] \leq \exp\Big(\frac{\lambda^2(b-a)^2}{8}\Big). Proof Without loss of generality, by replacing X by X - \mathbb{E}[X], we can assume \mathbb{E}[X] = 0, so that a \leq 0 \leq b. Since e^{\lambda x} is a convex function of x, we have that for all x \in [a,b], :e^{\lambda x}\leq \frac{b-x}{b-a}e^{\lambda a}+\frac{x-a}{b-a}e^{\lambda b}. So, :\begin{align} \mathbb{E}\left[e^{\lambda X}\right] &\leq \frac{b-\mathbb{E}[X]}{b-a}e^{\lambda a}+\frac{\mathbb{E}[X]-a}{b-a}e^{\lambda b}\\ &= \frac{b}{b-a}e^{\lambda a} + \frac{-a}{b-a}e ...
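A small numerical sanity check (ours; the two-point distribution on {−1, 2} is an arbitrary mean-zero choice): for a bounded variable the moment-generating function stays below exp(λ²(b−a)²/8) for every λ:

```python
import math

def mgf_two_point(lam, a, b, p_b):
    """E[e^{lam*X}] for X taking value b with probability p_b and a otherwise."""
    return (1 - p_b) * math.exp(lam * a) + p_b * math.exp(lam * b)

a, b = -1.0, 2.0
p_b = 1.0 / 3.0            # chosen so that E[X] = (2/3)*(-1) + (1/3)*2 = 0
for lam in (-2.0, -0.5, 0.5, 2.0):
    mgf = mgf_two_point(lam, a, b, p_b)
    bound = math.exp(lam**2 * (b - a)**2 / 8)
    assert mgf <= bound + 1e-12   # Hoeffding's lemma for the mean-zero case
    print(f"lambda={lam:+.1f}: E[e^(lam X)]={mgf:.4f} <= {bound:.4f}")
```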


Chernoff Bound
In probability theory, the Chernoff bound gives exponentially decreasing bounds on tail distributions of sums of independent random variables. Despite being named after Herman Chernoff, the author of the paper it first appeared in, the result is due to Herman Rubin. It is a sharper bound than the first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which only yield power-law bounds on tail decay. However, the Chernoff bound requires the variates to be independent, a condition that is not required by either Markov's inequality or Chebyshev's inequality (although Chebyshev's inequality does require the variates to be pairwise independent). The Chernoff bound is related to the Bernstein inequalities, which were developed earlier, and to Hoeffding's inequality. The generic bound The generic Chernoff bound for a random variable ''X'' is attained by applying Markov's inequality to e^{tX} (for t > 0). This gives a bound in terms of the moment-generating function ...
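As an illustrative sketch (not part of the article; the grid-search and parameter choices are ours), the generic bound min over t > 0 of e^{−ta}·E[e^{tX}] can be evaluated numerically for a Binomial(n, p) sum:

```python
import math

n, p = 100, 0.5

def mgf(t):
    """Moment-generating function of a Binomial(n, p) random variable."""
    return (1 - p + p * math.exp(t)) ** n

def chernoff_bound(a, mgf, t_grid):
    """Generic Chernoff bound: P(X >= a) <= min over t > 0 of e^{-t*a} * E[e^{t*X}]."""
    return min(math.exp(-t * a) * mgf(t) for t in t_grid)

a = 70
t_grid = [i / 100 for i in range(1, 300)]   # coarse grid over t > 0
bound = chernoff_bound(a, mgf, t_grid)
print(f"Chernoff bound on P(X >= {a}) for Binomial({n}, {p}): {bound:.6f}")
```

The bound decays exponentially as the threshold a moves away from the mean np, in contrast to the power-law decay of the Markov and Chebyshev bounds.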