Bhatia–Davis Inequality

In mathematics, the Bhatia–Davis inequality, named after Rajendra Bhatia and Chandler Davis, is an upper bound on the variance ''σ''2 of any bounded probability distribution on the real line.


Statement

Let ''m'' and ''M'' be the lower and upper bounds, respectively, for a set of real numbers ''a1'', ..., ''an'' with a particular probability distribution. Let ''μ'' be the expected value of this distribution. Then the Bhatia–Davis inequality states:

: \sigma^2 \le (M - \mu)(\mu - m). \,

Equality holds if and only if every ''aj'' in the set of values is equal either to ''M'' or to ''m''.
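As an illustration (an assumed numerical sketch, not part of the original article), the bound can be checked directly for a finite distribution supported on an interval [''m'', ''M''], together with the equality case in which all probability mass sits on the two endpoints; the specific numbers below are arbitrary:

```python
import random

# Arbitrary distribution on [m, M] = [0, 1]: the variance must not
# exceed (M - mu)(mu - m) by the Bhatia-Davis inequality.
m, M = 0.0, 1.0
random.seed(42)
values = [random.uniform(m, M) for _ in range(10)]
weights = [random.random() for _ in range(10)]
total = sum(weights)
probs = [w / total for w in weights]

mu = sum(p * x for p, x in zip(probs, values))
var = sum(p * (x - mu) ** 2 for p, x in zip(probs, values))
assert var <= (M - mu) * (mu - m)

# Equality case: all mass on the endpoints m and M.
p = 0.3
mu2 = p * M + (1 - p) * m
var2 = p * (M - mu2) ** 2 + (1 - p) * (m - mu2) ** 2
assert abs(var2 - (M - mu2) * (mu2 - m)) < 1e-12
```

For the two-point distribution the variance equals ''p''(1 − ''p'')(''M'' − ''m'')2, which matches (''M'' − ''μ'')(''μ'' − ''m'') exactly, in line with the stated equality condition.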


Proof

Since m \leq A \leq M,

: 0 \leq \mathbb{E}[(M - A)(A - m)] = -\mathbb{E}[A^2] - mM + (m+M)\mu.

Thus,

: \sigma^2 = \mathbb{E}[A^2] - \mu^2 \leq -mM + (m+M)\mu - \mu^2 = (M - \mu)(\mu - m).


Extensions of the Bhatia–Davis inequality

If \Phi is a positive and unital linear mapping of a ''C*''-algebra \mathcal{A} into a ''C*''-algebra \mathcal{B}, and ''A'' is a self-adjoint element of \mathcal{A} satisfying ''m'' \leq ''A'' \leq ''M'', then:

: \Phi(A^2) - (\Phi A)^2 \leq (M - \Phi A)(\Phi A - m).

If \mathit{X} is a discrete random variable such that P(X = x_i) = p_i, where i = 1, \ldots, n, then:

: s_p^2 = \sum_{i=1}^n p_i x_i^2 - \left(\sum_{i=1}^n p_i x_i\right)^2 \leq \left(M - \sum_{i=1}^n p_i x_i\right)\left(\sum_{i=1}^n p_i x_i - m\right),

where 0 \leq p_i \leq 1 and \sum_{i=1}^n p_i = 1.
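The ''C*''-algebra form can be sketched concretely (an assumed example, not from the original article) by taking \Phi to be the normalized trace on 2×2 complex matrices, which is a positive unital linear map sending a self-adjoint matrix ''A'' to a scalar:

```python
import numpy as np

# Phi(A) = tr(A)/2 is positive and unital on 2x2 matrices; check
#   Phi(A^2) - Phi(A)^2 <= (M - Phi(A)) (Phi(A) - m)
# for a self-adjoint A whose spectrum lies in [m, M].
A = np.array([[1.0, 0.5], [0.5, 2.0]])   # self-adjoint
eigs = np.linalg.eigvalsh(A)
m, M = eigs.min(), eigs.max()            # so m <= A <= M holds exactly

phi_A = np.trace(A) / 2                  # Phi(A), a scalar here
phi_A2 = np.trace(A @ A) / 2             # Phi(A^2)

assert phi_A2 - phi_A**2 <= (M - phi_A) * (phi_A - m) + 1e-12
```

Here the spectrum of ''A'' consists of exactly the two endpoints ''m'' and ''M'', so the inequality holds with equality up to floating-point error, mirroring the scalar equality condition.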


Comparisons to other inequalities

The Bhatia–Davis inequality is stronger than Popoviciu's inequality on variances (note, however, that Popoviciu's inequality does not require knowledge of the expectation or mean), as can be seen from the conditions for equality: equality holds in Popoviciu's inequality if and only if half of the ''aj'' are equal to the upper bound ''M'' and half are equal to the lower bound ''m''. Additionally, Sharma has made further refinements of the Bhatia–Davis inequality.
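That the Bhatia–Davis bound is never larger than Popoviciu's bound (''M'' − ''m'')2/4 follows from the AM–GM inequality applied to the factors ''M'' − ''μ'' and ''μ'' − ''m''. A short numerical sketch (arbitrary endpoints, assumed for illustration) confirms this over a grid of means:

```python
# For any mu in [m, M], (M - mu)(mu - m) <= (M - m)^2 / 4 by AM-GM,
# so the Bhatia-Davis bound is at least as tight as Popoviciu's.
m, M = -1.0, 3.0
popoviciu = (M - m) ** 2 / 4
for k in range(101):
    mu = m + (M - m) * k / 100
    bhatia_davis = (M - mu) * (mu - m)
    assert bhatia_davis <= popoviciu + 1e-12
```

The two bounds coincide only at the midpoint ''μ'' = (''m'' + ''M'')/2, which is why Popoviciu's equality condition requires the mass to be split evenly between the endpoints.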


See also

* Cramér–Rao bound
* Chapman–Robbins bound
* Popoviciu's inequality on variances

