In probability theory, Cantelli's inequality (also called the Chebyshev–Cantelli inequality and the one-sided Chebyshev inequality) is an improved version of Chebyshev's inequality for one-sided tail bounds. The inequality states that, for \lambda > 0,

: \Pr(X-\mathbb{E}[X]\ge\lambda) \le \frac{\sigma^2}{\sigma^2+\lambda^2},

where

:X is a real-valued random variable,

:\Pr is the probability measure,

:\mathbb{E}[X] is the expected value of X,

:\sigma^2 is the variance of X.

Applying the Cantelli inequality to -X gives a bound on the lower tail,

: \Pr(X-\mathbb{E}[X]\le -\lambda) \le \frac{\sigma^2}{\sigma^2+\lambda^2}.

While the inequality is often attributed to Francesco Paolo Cantelli, who published it in 1928, it originates in Chebyshev's work of 1874. When bounding the event that a random variable deviates from its mean in only one direction (positive or negative), Cantelli's inequality gives an improvement over Chebyshev's inequality. The Chebyshev inequality has "higher moments versions" and "vector versions", and so does the Cantelli inequality.
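One standard derivation applies Markov's inequality to a shifted square. For any u \ge 0, the event X-\mathbb{E}[X] \ge \lambda implies X-\mathbb{E}[X]+u \ge \lambda+u > 0, and hence (X-\mathbb{E}[X]+u)^2 \ge (\lambda+u)^2, so by Markov's inequality

: \Pr(X-\mathbb{E}[X]\ge\lambda) \le \frac{\mathbb{E}[(X-\mathbb{E}[X]+u)^2]}{(\lambda+u)^2} = \frac{\sigma^2+u^2}{(\lambda+u)^2}.

Minimizing the right-hand side over u \ge 0 (the minimum is attained at u = \sigma^2/\lambda) yields the stated bound \sigma^2/(\sigma^2+\lambda^2).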


Comparison to Chebyshev's inequality

For one-sided tail bounds, Cantelli's inequality is better, since Chebyshev's inequality can only get

: \Pr(X - \mathbb{E}[X] \geq \lambda) \leq \Pr(|X-\mathbb{E}[X]| \ge\lambda) \le \frac{\sigma^2}{\lambda^2}.

On the other hand, for two-sided tail bounds, Cantelli's inequality gives

: \Pr(|X-\mathbb{E}[X]| \ge\lambda) = \Pr(X-\mathbb{E}[X]\ge\lambda) + \Pr(X-\mathbb{E}[X]\le-\lambda) \le \frac{2\sigma^2}{\sigma^2+\lambda^2},

which is always worse than Chebyshev's inequality (when \lambda \geq \sigma; otherwise, both inequalities bound a probability by a value greater than one, and so are trivial).
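The comparison can be checked numerically. A minimal sketch (the helper names are illustrative, not from any library) evaluates both bounds for a unit-variance random variable:

```python
# Compare Cantelli's and Chebyshev's tail bounds for a unit-variance variable.
# Helper names are illustrative; this is a sketch, not a library API.

def cantelli_one_sided(sigma2: float, lam: float) -> float:
    """Cantelli's bound on Pr(X - E[X] >= lam)."""
    return sigma2 / (sigma2 + lam ** 2)

def chebyshev(sigma2: float, lam: float) -> float:
    """Chebyshev's bound on Pr(|X - E[X]| >= lam)."""
    return sigma2 / lam ** 2

def cantelli_two_sided(sigma2: float, lam: float) -> float:
    """Two-sided bound obtained by summing the two one-sided Cantelli bounds."""
    return 2 * sigma2 / (sigma2 + lam ** 2)

sigma2 = 1.0  # unit variance, so the trivial/non-trivial threshold is lam = 1
for lam in (1.0, 1.5, 2.0, 3.0):
    # One-sided: Cantelli is strictly smaller (better) than Chebyshev.
    assert cantelli_one_sided(sigma2, lam) < chebyshev(sigma2, lam)
    # Two-sided with lam >= sigma: the summed Cantelli bound is never better.
    assert cantelli_two_sided(sigma2, lam) >= chebyshev(sigma2, lam)
```

At \lambda = \sigma both two-sided bounds equal 1 and are trivial; for larger \lambda, Chebyshev's two-sided bound wins, while for the one-sided tail Cantelli's bound is always the stronger of the two.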


Generalizations

Various stronger inequalities can be shown. He, Zhang, and Zhang showed (Corollary 2.3) that, when \mathbb{E}[X]=0,\, \mathbb{E}[X^2]=1 and \lambda\ge0:

: \Pr(X\ge\lambda) \le 1- (2\sqrt{3}-3)\frac{(1+\lambda^2)^2}{\mathbb{E}[(X-\lambda)^4]}.

In the case \lambda=0 this matches a bound in Berger's "The Fourth Moment Method",

: \Pr(X\ge 0) \ge \frac{2\sqrt{3}-3}{\mathbb{E}[X^4]}.

This improves over Cantelli's inequality in that we can get a non-zero lower bound, even when \mathbb{E}[X]=0.
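Berger's fourth-moment bound can be sanity-checked against distributions with known moments. A minimal sketch (the helper name is illustrative) evaluates the bound \Pr(X\ge 0) \ge (2\sqrt{3}-3)/\mathbb{E}[X^4], which assumes \mathbb{E}[X]=0 and \mathbb{E}[X^2]=1:

```python
import math

def berger_lower_bound(fourth_moment: float) -> float:
    """Berger's lower bound on Pr(X >= 0), assuming E[X] = 0 and E[X^2] = 1."""
    return (2 * math.sqrt(3) - 3) / fourth_moment

# Rademacher variable (+1 or -1, each with probability 1/2):
# E[X^4] = 1 and Pr(X >= 0) = 1/2, so the bound 2*sqrt(3) - 3 ~= 0.464
# is close to tight.
assert berger_lower_bound(1.0) <= 0.5

# Standard normal: E[X^4] = 3 and Pr(X >= 0) = 1/2; the bound ~0.155 is looser.
assert berger_lower_bound(3.0) <= 0.5
```

By contrast, Cantelli's inequality applied to the lower tail gives only the trivial lower bound \Pr(X\ge 0) \ge 0 when \mathbb{E}[X]=0, which is the sense in which the fourth-moment bound is an improvement.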


See also

* Chebyshev's inequality
* Paley–Zygmund inequality

