Popoviciu's inequality on variances

In probability theory, Popoviciu's inequality, named after Tiberiu Popoviciu, is an upper bound on the variance ''σ''2 of any bounded probability distribution. Let ''M'' and ''m'' be upper and lower bounds on the values of any random variable with a particular probability distribution. Then Popoviciu's inequality states:

: \sigma^2 \le \frac14 ( M - m )^2.

Equality holds precisely when half of the probability is concentrated at each of the two bounds.

Sharma ''et al''. have sharpened Popoviciu's inequality:

: \sigma^2 + \left( \frac{\mu_3}{2\sigma} \right)^2 \le \frac14 (M - m)^2,

where ''μ''3 is the third central moment.

Popoviciu's inequality is weaker than the Bhatia–Davis inequality, which states

: \sigma^2 \le ( M - \mu )( \mu - m ),

where ''μ'' is the expectation of the random variable.

In the case of an independent sample of ''n'' observations from a bounded probability distribution, the von Szokefalvi Nagy inequality gives a lower bound to the variance of the sample mean:

: \sigma^2 \ge \frac{(M-m)^2}{2n}.
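Both bounds are straightforward to check numerically. The following sketch (with hypothetical values, not taken from the article) verifies Popoviciu's and the Bhatia–Davis inequalities for a small discrete distribution, and confirms that placing half the probability at each bound attains equality in Popoviciu's inequality:

```python
# Sanity check of Popoviciu's and the Bhatia-Davis inequalities for a
# discrete bounded distribution (hypothetical support and probabilities).

values = [0.0, 0.3, 0.7, 1.0]          # support of the distribution
probs = [0.1, 0.4, 0.3, 0.2]           # probabilities (sum to 1)

m, M = min(values), max(values)        # lower and upper bounds
mu = sum(p * x for p, x in zip(probs, values))               # mean
var = sum(p * (x - mu) ** 2 for p, x in zip(probs, values))  # sigma^2

popoviciu_bound = (M - m) ** 2 / 4
bhatia_davis_bound = (M - mu) * (mu - m)

# sigma^2 <= (M - mu)(mu - m) <= (M - m)^2 / 4
assert var <= bhatia_davis_bound <= popoviciu_bound

# Equality case of Popoviciu: half the mass at each of the two bounds.
mu2 = 0.5 * m + 0.5 * M
var2 = 0.5 * (m - mu2) ** 2 + 0.5 * (M - mu2) ** 2
assert abs(var2 - (M - m) ** 2 / 4) < 1e-12
```

The same check works for any bounded distribution: the two-point equality case is the unique maximizer of the variance given the bounds.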


Proof via the Bhatia–Davis inequality

Let A be a random variable with mean \mu, variance \sigma^2, and \Pr(m \leq A \leq M) = 1. Then, since m \leq A \leq M,

: 0 \leq \mathbb{E}[(M - A)(A - m)] = -\mathbb{E}[A^2] - mM + (m + M)\mu.

Thus,

: \sigma^2 = \mathbb{E}[A^2] - \mu^2 \leq -mM + (m + M)\mu - \mu^2 = (M - \mu)(\mu - m).

Now, applying the inequality of arithmetic and geometric means, ab \leq \left( \frac{a+b}{2} \right)^2, with a = M - \mu and b = \mu - m, yields the desired result:

: \sigma^2 \leq (M - \mu)(\mu - m) \leq \frac{(M - m)^2}{4}.

