
Azuma's Inequality
In probability theory, the Azuma–Hoeffding inequality (named after Kazuoki Azuma and Wassily Hoeffding) gives a concentration result for the values of martingales that have bounded differences. Suppose \{X_0, X_1, X_2, \ldots\} is a martingale (or super-martingale) and :|X_k - X_{k-1}| \leq c_k \, almost surely. Then for all positive integers ''N'' and all positive reals \epsilon, :\text{P}(X_N - X_0 \geq \epsilon) \leq \exp\left(-\frac{\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). And symmetrically (when ''X''''k'' is a sub-martingale): :\text{P}(X_N - X_0 \leq -\epsilon) \leq \exp\left(-\frac{\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). If ''X'' is a martingale, using both inequalities above and applying the union bound allows one to obtain a two-sided bound: :\text{P}(|X_N - X_0| \geq \epsilon) \leq 2\exp\left(-\frac{\epsilon^2}{2\sum_{k=1}^N c_k^2}\right). Proof The proof shares the same idea as the proof of the general form of Azuma's inequality listed below; in fact, it can be viewed as a direct corollary of that general form. A general form of Azuma's inequality Limitation of ...
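As a quick sanity check of the one-sided bound, the sketch below (illustrative only; the function and variable names are my own, not from the article) simulates a symmetric ±1 random walk, which is a martingale with bounded differences c_k = 1, and compares the empirical tail probability against the Azuma bound exp(-ε²/(2N)).

```python
import math
import random

def azuma_bound(eps, c):
    """One-sided Azuma bound exp(-eps^2 / (2 * sum of c_k^2))."""
    return math.exp(-eps ** 2 / (2 * sum(ck ** 2 for ck in c)))

def walk(n):
    """Symmetric +-1 random walk: a martingale with |X_k - X_{k-1}| = 1."""
    return sum(random.choice((-1, 1)) for _ in range(n))

random.seed(0)
n, eps, trials = 100, 20.0, 10_000
bound = azuma_bound(eps, [1.0] * n)   # exp(-400 / 200) = exp(-2) ~ 0.135
empirical = sum(walk(n) >= eps for _ in range(trials)) / trials
print(f"empirical tail {empirical:.4f} <= Azuma bound {bound:.4f}")
```

The bound is loose here (the walk's standard deviation is 10, so the true tail is closer to the Gaussian value), which is typical: Azuma trades sharpness for generality over all bounded-difference martingales.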



Probability Theory
Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms of probability. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event. Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is no ...
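A minimal illustration of these definitions, using nothing beyond the Python standard library: a finite sample space for one roll of a fair die, a probability measure assigning each outcome mass 1/6, and events as subsets of the sample space.

```python
from fractions import Fraction

# Sample space omega; the probability measure P gives each outcome mass 1/6.
omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in omega}

def prob(event):
    """Probability of an event, i.e. a subset of the sample space."""
    return sum(P[w] for w in event)

even = {w for w in omega if w % 2 == 0}
print(prob(omega), prob(even))   # normalisation gives 1; P({2,4,6}) = 1/2
```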


Doob Decomposition Theorem
In the theory of stochastic processes in discrete time, a part of the mathematical theory of probability, the Doob decomposition theorem gives a unique decomposition of every adapted and integrable stochastic process as the sum of a martingale and a predictable process (or "drift") starting at zero. The theorem was proved by and is named for Joseph L. Doob. The analogous theorem in the continuous-time case is the Doob–Meyer decomposition theorem. Statement Let (\Omega, \mathcal{F}, \mathbb{P}) be a probability space, I = \{0, 1, \ldots, N\} with N \in \N or I = \N_0 a finite or countably infinite index set, (\mathcal{F}_n)_{n \in I} a filtration of \mathcal{F}, and X = (X_n)_{n \in I} an adapted stochastic process with \mathbb{E}[|X_n|] < \infty for all n \in I. Then there exist a martingale M = (M_n)_{n \in I} and an integrable predictable process A = (A_n)_{n \in I} starting with A_0 = 0 such that X_n = M_n + A_n for every n \in I. Here predictable means that A_n is \mathcal{F}_{n-1}-measurable for every n \in I \setminus \{0\}. This decomposition is almost surely unique. Remark The theorem is valid word for word also for stochastic processes taking values i ...
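The decomposition is constructive: A_n accumulates the predictable conditional-mean increments E[X_k − X_{k−1} | F_{k−1}], and M_n = X_n − A_n is the martingale part. A sketch (the helper names are my own; for the example the conditional mean increment is assumed known in closed form):

```python
import random

def doob_decompose(x, cond_mean_increment):
    """Split a path x[0..n] into martingale part m and predictable drift a,
    where cond_mean_increment(past) = E[X_k - X_{k-1} | F_{k-1}]."""
    a = [0.0]
    for k in range(1, len(x)):
        a.append(a[-1] + cond_mean_increment(x[:k]))
    m = [xk - ak for xk, ak in zip(x, a)]
    return m, a

# Example: X_n = number of successes in n Bernoulli(p) trials.
# Then A_n = n*p (predictable drift) and M_n = X_n - n*p (martingale).
random.seed(1)
p = 0.3
x = [0]
for _ in range(10):
    x.append(x[-1] + (1 if random.random() < p else 0))
m, a = doob_decompose(x, lambda past: p)
print("drift A:", a)
print("martingale M:", m)
```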


Tôhoku Mathematical Journal
The ''Tohoku Mathematical Journal'' is a mathematical research journal published by Tohoku University in Japan. It was founded in August 1911 by Tsuruichi Hayashi. History Due to World War II, the publication of the journal stopped in 1943 with volume 49. Publication was resumed in 1949 with the volume numbering starting again at 1. In order to distinguish between the identically numbered volumes, volumes from the first publishing period are referred to as the ''first series'', whereas the later volumes are called the ''second series''. Before volume 51 of the second series, the journal was called ''Tôhoku Mathematical Journal'', with a circumflex over the second letter of ''Tohoku''. Selected papers *. The first publication of the Sprague–Grundy theorem, the basis for much of combinatorial game theory, later independently rediscovered by P. M. Grundy. *. This paper describes Weiszfeld's algorithm for finding the geometric median. *. This paper, often referred to as " The Tohoku pap ...


Concentration Inequality
In chemistry, concentration is the abundance of a constituent divided by the total volume of a mixture. Several types of mathematical description can be distinguished: ''mass concentration'', ''molar concentration'', ''number concentration'', and ''volume concentration''. The concentration can refer to any kind of chemical mixture, but most frequently refers to solutes and solvents in solutions. The molar (amount) concentration has variants, such as normal concentration and osmotic concentration. Dilution is reduction of concentration, e.g. by adding solvent to a solution. The verb to concentrate means to increase concentration, the opposite of dilute. Etymology ''Concentration'', from post-classical Latin ''concentratio'' (the action or an act of coming together at a single place, bringing to a common center), was used in 1550 or earlier; similar terms are attested in Italian (1589), Spanish (1589), English (1606), and French (1632). Qualitative description Often in informal, non-techn ...
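The quantitative definitions above reduce to simple arithmetic; the sketch below (with illustrative helper names of my own) computes a molar concentration c = n/V and a dilution via the relation c₁V₁ = c₂V₂.

```python
def molar_concentration(moles, volume_litres):
    """Amount (molar) concentration c = n / V, in mol/L."""
    return moles / volume_litres

def dilute(c1, v1, v2):
    """Dilution by adding solvent: c1 * V1 = c2 * V2, so c2 = c1 * V1 / V2."""
    return c1 * v1 / v2

c1 = molar_concentration(0.5, 2.0)   # 0.5 mol in 2 L -> 0.25 mol/L
c2 = dilute(c1, 2.0, 4.0)            # doubling the volume halves c
print(c1, c2)                        # 0.25 0.125
```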




Sergei Bernstein
Sergei Natanovich Bernstein (5 March 1880 – 26 October 1968) was a Ukrainian and Soviet mathematician of Jewish origin known for contributions to partial differential equations, differential geometry, probability theory, and approximation theory. Life Bernstein was born into the Jewish family of the prominent Ukrainian physiologist Nathaniel Bernstein in Odessa. Sergei was brought up in Odessa, but his father died on 4 February 1891, just before Sergei's eleventh birthday. He graduated from high school in 1898. After this, following his mother's wishes, he went with his elder sister to Paris. Bernstein's sister studied biology in Paris and did not return to Ukraine but worked at the Pasteur Institute. After one year studying mathematics at the Sorbonne, Bernstein decided that he would rather become an engineer and entered the École supérieure d'électricité. However, he continued to be interested in mathematics and spent three terms at the University of G ...


Bernstein Inequalities (probability Theory)
In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let ''X''1, ..., ''X''''n'' be independent Bernoulli random variables taking values +1 and −1 with probability 1/2 each (this distribution is also known as the Rademacher distribution); then for every positive \varepsilon, :\mathbb{P}\left(\left|\frac{1}{n}\sum_{i=1}^n X_i\right| > \varepsilon\right) \leq 2\exp\left(-\frac{n\varepsilon^2}{2}\right). Bernstein inequalities were proven and published by Sergei Bernstein in the 1920s and 1930s.J.V. Uspensky, "Introduction to Mathematical Probability", McGraw-Hill Book Company, 1937 Later, these inequalities were rediscovered several times in various forms. Thus, special cases of the Bernstein inequalities are also known as the Chernoff bound, Hoeffding's inequality and Azuma's inequality. The martingale case of the Bernstein inequality is known as Freedman's inequality and its refinement is k ...
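A rough numerical check of the displayed bound for Rademacher variables, with illustrative names of my own choosing: the empirical frequency of the deviation event should fall below 2·exp(−nε²/2).

```python
import math
import random

def rademacher_mean(n):
    """Mean of n independent +-1 (Rademacher) variables."""
    return sum(random.choice((-1, 1)) for _ in range(n)) / n

random.seed(2)
n, eps, trials = 200, 0.2, 5_000
bound = 2 * math.exp(-n * eps ** 2 / 2)        # 2 * exp(-4) ~ 0.037
empirical = sum(abs(rademacher_mean(n)) > eps for _ in range(trials)) / trials
print(f"empirical {empirical:.4f} <= bound {bound:.4f}")
```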


Exponential Decay
A quantity is subject to exponential decay if it decreases at a rate proportional to its current value. Symbolically, this process can be expressed by the following differential equation, where ''N'' is the quantity and \lambda (lambda) is a positive rate called the exponential decay constant, disintegration constant, rate constant, or transformation constant: :\frac{dN}{dt} = -\lambda N(t). The solution to this equation (see derivation below) is: :N(t) = N_0 e^{-\lambda t}, where ''N''(''t'') is the quantity at time ''t'', and ''N''0 = ''N''(0) is the initial quantity, that is, the quantity at time ''t'' = 0. Measuring rates of decay Mean lifetime If the decaying quantity, ''N''(''t''), is the number of discrete elements in a certain set ...
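The closed-form solution can be checked against a naive forward-Euler integration of the differential equation; this is only a sketch with hypothetical parameter values and helper names of my own.

```python
import math

def decay(n0, lam, t):
    """Closed-form solution N(t) = N_0 * exp(-lam * t)."""
    return n0 * math.exp(-lam * t)

def euler_decay(n0, lam, t, steps=100_000):
    """Forward-Euler integration of dN/dt = -lam * N, for comparison."""
    dt = t / steps
    n = n0
    for _ in range(steps):
        n -= lam * n * dt
    return n

n0, lam, t = 1000.0, 0.5, 3.0            # hypothetical values
exact = decay(n0, lam, t)
approx = euler_decay(n0, lam, t)
print(f"exact {exact:.3f}, Euler {approx:.3f}, mean lifetime {1 / lam}")
```

For small step sizes the two agree closely; the mean lifetime 1/λ mentioned in the article is the time for the quantity to fall to 1/e of its initial value.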


Randomized Algorithm
A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic or procedure. The algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of randomness determined by the random bits; thus either the running time, or the output (or both) are random variables. There is a distinction between algorithms that use the random input so that they always terminate with the correct answer, but where the expected running time is finite (Las Vegas algorithms, for example Quicksort), and algorithms which have a chance of producing an incorrect result (Monte Carlo algorithms, for example the Monte Carlo algorithm for the MFAS problem) or fail to produce a result, either by signaling a failure or by failing to terminate. In some cases, probabilistic algorithms are the only practical means of solving a problem. In common practice, randomized alg ...
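A classic Las Vegas example is randomized quicksort: the output is always a correctly sorted list, and only the running time depends on the random pivot choices. A minimal sketch:

```python
import random

def randomized_quicksort(a):
    """Las Vegas algorithm: always returns the correct sorted result;
    only the running time varies with the random pivot choices."""
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

data = [5, 3, 8, 1, 9, 2, 7]
print(randomized_quicksort(data))   # -> [1, 2, 3, 5, 7, 8, 9]
```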




Doob Martingale
In the mathematical theory of probability, a Doob martingale (named after Joseph L. Doob, also known as a Levy martingale) is a stochastic process that approximates a given random variable and has the martingale property with respect to the given filtration. It may be thought of as the evolving sequence of best approximations to the random variable based on information accumulated up to a certain time. When analyzing sums, random walks, or other additive functions of independent random variables, one can often apply the central limit theorem, law of large numbers, Chernoff's inequality, Chebyshev's inequality or similar tools. When analyzing similar objects where the differences are not independent, the main tools are martingales and Azuma's inequality ...
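A concrete Doob martingale: reveal n fair coin flips one at a time and track X_k = E[total number of heads | first k flips]. The sketch below uses illustrative names of my own; note that the differences |X_k − X_{k−1}| are bounded by 1/2, so Azuma's inequality applies.

```python
import random

def doob_martingale_path(flips, p=0.5):
    """X_k = E[total heads | first k flips] = heads_so_far + (n - k) * p."""
    n = len(flips)
    heads = 0
    path = [n * p]                      # X_0 is the unconditional expectation
    for k, z in enumerate(flips, start=1):
        heads += z
        path.append(heads + (n - k) * p)
    return path

random.seed(3)
flips = [random.random() < 0.5 for _ in range(6)]
path = doob_martingale_path(flips)
print(path)    # starts at E[f] = 3.0, ends at the realised head count
```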


Hoeffding's Lemma
In probability theory, Hoeffding's lemma is an inequality that bounds the moment-generating function of any bounded random variable, implying that such variables are subgaussian. It is named after the Finnish–American mathematical statistician Wassily Hoeffding. The proof of Hoeffding's lemma uses Taylor's theorem and Jensen's inequality. Hoeffding's lemma is itself used in the proof of Hoeffding's inequality as well as its generalization, McDiarmid's inequality. Statement Let ''X'' be any real-valued random variable such that a \leq X \leq b almost surely, i.e. with probability one. Then, for all \lambda \in \mathbb{R}, :\mathbb{E}\left[e^{\lambda X}\right] \leq \exp\Big(\lambda \mathbb{E}[X] + \frac{\lambda^2 (b-a)^2}{8}\Big), or equivalently, :\mathbb{E}\left[e^{\lambda (X - \mathbb{E}[X])}\right] \leq \exp\Big(\frac{\lambda^2 (b-a)^2}{8}\Big). Proof The following proof is direct but somewhat ad hoc. Another proof, with a slightly worse constant, is also available using symmetrization. Statement This statement and proof uses the language of subgaus ...
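The lemma can be spot-checked numerically: for X ~ Bernoulli(p), supported on [0, 1], the centered moment-generating function is available in closed form. A sketch with helper names of my own:

```python
import math

def mgf_centered_bernoulli(p, lam):
    """E[exp(lam * (X - E X))] for X ~ Bernoulli(p) on [0, 1]."""
    return (1 - p) * math.exp(-lam * p) + p * math.exp(lam * (1 - p))

def hoeffding_bound(lam, a=0.0, b=1.0):
    """Right-hand side exp(lam^2 * (b - a)^2 / 8) of Hoeffding's lemma."""
    return math.exp(lam ** 2 * (b - a) ** 2 / 8)

for p in (0.1, 0.5, 0.9):
    for lam in (-3.0, -0.5, 0.5, 3.0):
        assert mgf_centered_bernoulli(p, lam) <= hoeffding_bound(lam)
print("Hoeffding's lemma bound holds on this grid")
```

For p = 1/2 the centered MGF is cosh(λ/2), which sits below exp(λ²/8), as the lemma promises.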



Chernoff Bound
In probability theory, a Chernoff bound is an exponentially decreasing upper bound on the tail of a random variable based on its moment generating function. The minimum of all such exponential bounds forms ''the'' Chernoff or Chernoff-Cramér bound, which may decay faster than exponential (e.g. sub-Gaussian). It is especially useful for sums of independent random variables, such as sums of Bernoulli random variables. The bound is commonly named after Herman Chernoff who described the method in a 1952 paper, though Chernoff himself attributed it to Herman Rubin. In 1938 Harald Cramér had published an almost identical concept now known as Cramér's theorem. It is a sharper bound than the first- or second-moment-based tail bounds such as Markov's inequality or Chebyshev's inequality, which only yield power-law bounds on tail decay. However, when applied to sums the Chernoff bound requires the random variables to be independent, a condition that is not required by either Markov's i ...
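The generic recipe — minimise e^{−λa}·E[e^{λS}] over λ > 0 — can be carried out numerically for a Binomial sum. The grid search below is only an illustrative sketch under my own naming, not an optimised implementation; a closed-form optimum exists via the relative entropy.

```python
import math

def chernoff_upper_tail(n, p, a):
    """min over lam in (0, 5] of exp(-lam * a) * M(lam), where
    M(lam) = (1 - p + p * exp(lam))^n is the MGF of a Binomial(n, p) sum."""
    lams = (i / 1000 for i in range(1, 5001))
    return min(math.exp(-lam * a) * (1 - p + p * math.exp(lam)) ** n
               for lam in lams)

n, p, a = 100, 0.5, 70
bound = chernoff_upper_tail(n, p, a)
print(f"Chernoff bound on P(S >= {a}) for Binomial({n}, {p}): {bound:.3e}")
```

For these values the optimal λ is ln(7/3) ≈ 0.847, and the grid minimum is essentially exp(−n·D(0.7‖0.5)), far sharper than Chebyshev's power-law bound.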