Cantelli's inequality

In probability theory, Cantelli's inequality (also called the Chebyshev–Cantelli inequality and the one-sided Chebyshev inequality) is an improved version of Chebyshev's inequality for one-sided tail bounds. The inequality states that, for \lambda > 0,
: \Pr(X - \mathbb{E}[X] \ge \lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2},
where
:X is a real-valued random variable,
:\Pr is the probability measure,
:\mathbb{E}[X] is the expected value of X,
:\sigma^2 is the variance of X.
Applying the Cantelli inequality to -X gives a bound on the lower tail,
: \Pr(X - \mathbb{E}[X] \le -\lambda) \le \frac{\sigma^2}{\sigma^2 + \lambda^2}.
While the inequality is often attributed to Francesco Paolo Cantelli, who published it in 1928, it originates in Chebyshev's work of 1874 (Ghosh 2002). When bounding the event that a random variable deviates from its mean in only one direction (positive or negative), Cantelli's inequality gives an improvement over Chebyshev's inequality. The Chebyshev inequality has "higher moments" versions and "vector" versions, and so does the Cantelli inequality.
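The bound is sharp: for given \sigma^2 > 0 and \lambda > 0, equality holds for the two-point distribution that places probability \sigma^2/(\sigma^2+\lambda^2) on the value \mathbb{E}[X] + \lambda and probability \lambda^2/(\sigma^2+\lambda^2) on the value \mathbb{E}[X] - \sigma^2/\lambda. As a numerical illustration, taking \lambda = \sigma gives
: \Pr(X - \mathbb{E}[X] \ge \sigma) \le \frac{\sigma^2}{\sigma^2 + \sigma^2} = \frac{1}{2},
whereas Chebyshev's inequality gives only the trivial bound \sigma^2/\sigma^2 = 1.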


Comparison to Chebyshev's inequality

For one-sided tail bounds, Cantelli's inequality is better, since Chebyshev's inequality can only give
: \Pr(X - \mathbb{E}[X] \ge \lambda) \le \Pr(|X - \mathbb{E}[X]| \ge \lambda) \le \frac{\sigma^2}{\lambda^2}.
On the other hand, for two-sided tail bounds, Cantelli's inequality gives
: \Pr(|X - \mathbb{E}[X]| \ge \lambda) = \Pr(X - \mathbb{E}[X] \ge \lambda) + \Pr(X - \mathbb{E}[X] \le -\lambda) \le \frac{2\sigma^2}{\sigma^2 + \lambda^2},
which is always worse than Chebyshev's inequality (when \lambda \geq \sigma; otherwise, both inequalities bound a probability by a value greater than one, and so are trivial).
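As a concrete comparison, take \lambda = 2\sigma. For the upper tail, Chebyshev's inequality gives the bound \sigma^2/\lambda^2 = 1/4, while Cantelli's inequality gives the sharper bound \sigma^2/(\sigma^2 + 4\sigma^2) = 1/5. For the two-sided tail, Chebyshev's inequality again gives 1/4, while the bound obtained by summing the two Cantelli bounds is 2\sigma^2/(\sigma^2 + 4\sigma^2) = 2/5, which is weaker.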


Proof

Let X be a real-valued random variable with finite variance \sigma^2 and expectation \mu = \mathbb{E}[X], and define Y = X - \mu (so that \mathbb{E}[Y] = 0 and \operatorname{Var}(Y) = \sigma^2). Then, for any u \geq 0, we have
: \Pr(X - \mu \geq \lambda) = \Pr(Y \geq \lambda) = \Pr(Y + u \geq \lambda + u) \leq \Pr((Y + u)^2 \geq (\lambda + u)^2) \leq \frac{\mathbb{E}[(Y + u)^2]}{(\lambda + u)^2} = \frac{\sigma^2 + u^2}{(\lambda + u)^2},
the last inequality being a consequence of Markov's inequality. As the above holds for any choice of u \geq 0, we can apply it with the value that minimizes the function u \geq 0 \mapsto \frac{\sigma^2 + u^2}{(\lambda + u)^2}. By differentiating, this minimizer can be seen to be u_\ast = \frac{\sigma^2}{\lambda}, leading to
: \Pr(X - \mu \geq \lambda) \leq \frac{\sigma^2 + u_\ast^2}{(\lambda + u_\ast)^2} = \frac{\sigma^2 + \sigma^4/\lambda^2}{(\lambda + \sigma^2/\lambda)^2} = \frac{\sigma^2}{\sigma^2 + \lambda^2} \qquad \text{if } \lambda > 0.
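The bound can also be checked numerically. The following Python sketch (illustrative only; the choice of an exponential distribution, which has mean and variance both equal to 1, is arbitrary) estimates the upper-tail probability by simulation and compares it with the Cantelli and Chebyshev bounds.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice: exponential distribution with scale 1,
# so E[X] = 1 and Var(X) = sigma^2 = 1.
mu, sigma2 = 1.0, 1.0
x = rng.exponential(scale=1.0, size=10**6)

for lam in [0.5, 1.0, 2.0, 4.0]:
    tail = np.mean(x - mu >= lam)            # empirical Pr(X - E[X] >= lambda)
    cantelli = sigma2 / (sigma2 + lam**2)    # one-sided Cantelli bound
    chebyshev = sigma2 / lam**2              # two-sided Chebyshev bound
    print(f"lambda={lam}: empirical={tail:.4f}, "
          f"Cantelli={cantelli:.4f}, Chebyshev={chebyshev:.4f}")

For every \lambda, the empirical tail probability stays below the Cantelli bound, and for \lambda < \sigma the Cantelli bound is the only nontrivial one.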


Generalizations

Using more moments, various stronger inequalities can be shown. He, Zhang, and Zhang showed that, when \mathbb{E}[X] = 0 and \mathbb{E}[X^2] = 1,
: \Pr(X \ge \lambda) \le 1 - (2\sqrt{3} - 3)\,\frac{(1 + \lambda^2)^2}{\mathbb{E}[(X - \lambda)^4]}.


See also

* Chebyshev's inequality
* Paley–Zygmund inequality


References

Ghosh, B.K. (2002). "Probability inequalities related to Markov's theorem". ''The American Statistician'', 56(3), 186–190.