Kolmogorov inequality

In probability theory, Kolmogorov's inequality is a so-called "maximal inequality" that gives a bound on the probability that the partial sums of a finite collection of independent random variables exceed some specified bound.


Statement of the inequality

Let ''X''<sub>1</sub>, ..., ''X''<sub>''n''</sub> : Ω → '''R''' be independent random variables defined on a common probability space (Ω, ''F'', Pr), with expected value E[''X''<sub>''k''</sub>] = 0 and variance Var[''X''<sub>''k''</sub>] < +∞ for ''k'' = 1, ..., ''n''. Then, for each λ > 0,

:<math>\Pr\left(\max_{1\leq k\leq n} |S_k| \geq \lambda\right) \leq \frac{1}{\lambda^2} \operatorname{Var}[S_n] = \frac{1}{\lambda^2} \sum_{k=1}^n \operatorname{Var}[X_k] = \frac{1}{\lambda^2} \sum_{k=1}^n \operatorname{E}[X_k^2],</math>

where ''S''<sub>''k''</sub> = ''X''<sub>1</sub> + ... + ''X''<sub>''k''</sub>. The convenience of this result is that we can bound the worst-case deviation of a random walk at any point in time using its value at the end of the time interval.
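As a quick numerical illustration (my own sketch, not part of the original article), the bound can be checked by Monte Carlo simulation of a simple ±1 random walk, for which each step has mean 0 and variance 1, so Var(''S''<sub>''n''</sub>) = ''n'':

```python
import numpy as np

# Monte Carlo check of Kolmogorov's inequality for a simple +/-1 random walk.
# Each step X_k has mean 0 and variance 1, so Var(S_n) = n.
rng = np.random.default_rng(0)
n, trials, lam = 50, 100_000, 10.0

X = rng.choice([-1.0, 1.0], size=(trials, n))
S = np.cumsum(X, axis=1)              # partial sums S_1, ..., S_n per trial
max_abs = np.abs(S).max(axis=1)       # max_{1 <= k <= n} |S_k| per trial

empirical = (max_abs >= lam).mean()   # estimated Pr(max |S_k| >= lambda)
bound = n / lam**2                    # Var(S_n) / lambda^2 = 0.5 here
print(f"empirical = {empirical:.3f}, Kolmogorov bound = {bound:.3f}")
```

The empirical frequency stays below the bound, as the inequality guarantees; note that the bound is generally not tight.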


Proof

The following argument employs discrete martingales. As argued in the discussion of Doob's martingale inequality, the sequence <math>S_1, S_2, \dots, S_n</math> is a martingale. Define <math>(Z_i)_{i=0}^n</math> as follows. Let <math>Z_0 = 0</math>, and

:<math>Z_{i+1} = \begin{cases} S_{i+1} & \text{if } \displaystyle\max_{1 \leq j \leq i} |S_j| < \lambda \\ Z_i & \text{otherwise} \end{cases}</math>

for all <math>i</math>. Then <math>(Z_i)_{i=0}^n</math> is also a martingale. For any martingale <math>M_i</math> with <math>M_0 = 0</math>, we have

:<math>\begin{align} \sum_{i=1}^n \operatorname{E}[(M_i - M_{i-1})^2] &= \sum_{i=1}^n \operatorname{E}[M_i^2 - 2 M_i M_{i-1} + M_{i-1}^2] \\ &= \sum_{i=1}^n \operatorname{E}\left[M_i^2 - 2 (M_{i-1} + M_i - M_{i-1}) M_{i-1} + M_{i-1}^2\right] \\ &= \sum_{i=1}^n \operatorname{E}\left[M_i^2 - M_{i-1}^2\right] - 2 \operatorname{E}\left[M_{i-1} (M_i - M_{i-1})\right] \\ &= \operatorname{E}[M_n^2] - \operatorname{E}[M_0^2] = \operatorname{E}[M_n^2], \end{align}</math>

where the cross terms <math>\operatorname{E}[M_{i-1}(M_i - M_{i-1})]</math> vanish by the martingale property. Applying this result to the martingales <math>(S_i)_{i=0}^n</math> and <math>(Z_i)_{i=0}^n</math>, we have

:<math>\begin{align} \Pr\left( \max_{1 \leq i \leq n} |S_i| \geq \lambda \right) &= \Pr[|Z_n| \geq \lambda] \\ &\leq \frac{1}{\lambda^2} \operatorname{E}[Z_n^2] = \frac{1}{\lambda^2} \sum_{i=1}^n \operatorname{E}[(Z_i - Z_{i-1})^2] \\ &\leq \frac{1}{\lambda^2} \sum_{i=1}^n \operatorname{E}[(S_i - S_{i-1})^2] = \frac{1}{\lambda^2} \operatorname{E}[S_n^2] = \frac{1}{\lambda^2} \operatorname{Var}[S_n], \end{align}</math>

where the first inequality follows by Chebyshev's inequality. This inequality was generalized by Hájek and Rényi in 1955.
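The stopped sequence <math>Z_i</math> can also be simulated; the sketch below (my own illustration, not from the article) freezes a ±1 walk once <math>|S_j|</math> reaches λ, and confirms numerically that the event <math>\{\max_i |S_i| \geq \lambda\}</math> coincides with <math>\{|Z_n| \geq \lambda\}</math> while <math>\operatorname{E}[Z_n^2] \leq \operatorname{E}[S_n^2]</math>:

```python
import numpy as np

# Simulate the stopped martingale Z from the proof: Z follows S until the
# first time |S_j| >= lambda, then stays frozen at that value.
rng = np.random.default_rng(1)
n, trials, lam = 50, 100_000, 10.0

X = rng.choice([-1.0, 1.0], size=(trials, n))
S = np.cumsum(X, axis=1)              # columns are S_1, ..., S_n

Z = S.copy()
for i in range(1, n):
    # Trials where max_{1 <= j <= i} |S_j| >= lambda are already frozen.
    frozen = np.abs(S[:, :i]).max(axis=1) >= lam
    Z[frozen, i] = Z[frozen, i - 1]   # Z_{i+1} = Z_i once frozen

hit = np.abs(S).max(axis=1) >= lam    # event {max_i |S_i| >= lambda}
same_event = np.array_equal(hit, np.abs(Z[:, -1]) >= lam)
print(same_event, (Z[:, -1] ** 2).mean(), (S[:, -1] ** 2).mean())
```

Applying Chebyshev's inequality to <math>Z_n</math> is precisely what turns this construction into the maximal bound.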


See also

* Chebyshev's inequality
* Etemadi's inequality
* Landau–Kolmogorov inequality
* Markov's inequality
* Bernstein inequalities (probability theory)


References

* (Theorem 22.4)
* {{PlanetMath attribution|id=3687|title=Kolmogorov's inequality}}