Bhatia–Davis inequality

In mathematics, the Bhatia–Davis inequality, named after Rajendra Bhatia and Chandler Davis, is an upper bound on the variance ''σ''2 of any bounded probability distribution on the real line.


Statement

Let ''m'' and ''M'' be the lower and upper bounds, respectively, for a set of real numbers ''a1'', ..., ''an'' with a particular probability distribution. Let ''μ'' be the expected value of this distribution. Then the Bhatia–Davis inequality states:

: \sigma^2 \le (M - \mu)(\mu - m).

Equality holds if and only if every ''aj'' in the set of values is equal either to ''M'' or to ''m''.
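The statement can be checked numerically. The sketch below, with illustrative sample values and equal weights, computes the population variance and compares it against the Bhatia–Davis bound:

```python
import numpy as np

# Illustrative sample; m and M are its lower and upper bounds.
a = np.array([0.0, 0.2, 0.5, 0.9, 1.0])
m, M = a.min(), a.max()
mu = a.mean()

variance = a.var()                 # population variance: E[A^2] - mu^2
bound = (M - mu) * (mu - m)        # Bhatia–Davis upper bound

assert variance <= bound
print(f"variance = {variance:.4f} <= bound = {bound:.4f}")
```

For this sample the variance is 0.1496 against a bound of 0.2496; the bound is not attained because some values lie strictly between ''m'' and ''M''.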


Proof

Since m \leq A \leq M,

: 0 \leq \mathbb{E}[(M - A)(A - m)] = -\mathbb{E}[A^2] - mM + (m+M)\mu.

Thus,

: \sigma^2 = \mathbb{E}[A^2] - \mu^2 \leq -mM + (m+M)\mu - \mu^2 = (M - \mu)(\mu - m).


Extensions of the Bhatia–Davis inequality

If \Phi is a positive and unital linear mapping of a ''C*''-algebra \mathcal{A} into a ''C*''-algebra \mathcal{B}, and ''A'' is a self-adjoint element of \mathcal{A} satisfying m \leq A \leq M, then:

: \Phi(A^2) - (\Phi A)^2 \leq (M - \Phi A)(\Phi A - m).

If \mathit{X} is a discrete random variable such that P(X = x_i) = p_i, where i = 1, ..., n, then:

: s_p^2 = \sum_{i=1}^n p_i x_i^2 - \Big(\sum_{i=1}^n p_i x_i\Big)^2 \leq \Big(M - \sum_{i=1}^n p_i x_i\Big)\Big(\sum_{i=1}^n p_i x_i - m\Big),

where 0 \leq p_i \leq 1 and \sum_{i=1}^n p_i = 1.
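The discrete form above can likewise be verified directly. A minimal sketch, using hypothetical values x_i and probabilities p_i:

```python
import numpy as np

# Hypothetical discrete distribution: P(X = x_i) = p_i.
x = np.array([1.0, 2.0, 4.0])
p = np.array([0.2, 0.5, 0.3])
assert np.isclose(p.sum(), 1.0)        # probabilities must sum to 1

m, M = x.min(), x.max()
mean = np.dot(p, x)                    # sum of p_i * x_i
s2 = np.dot(p, x**2) - mean**2         # weighted variance s_p^2

bound = (M - mean) * (mean - m)
assert s2 <= bound
print(f"s_p^2 = {s2:.4f} <= bound = {bound:.4f}")
```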


Comparisons to other inequalities

The Bhatia–Davis inequality is stronger than Popoviciu's inequality on variances (note, however, that Popoviciu's inequality does not require knowledge of the expectation or mean), as can be seen from the conditions for equality. Equality holds in Popoviciu's inequality if and only if half of the ''aj'' are equal to the upper bound ''M'' and the other half are equal to the lower bound ''m''. Additionally, Sharma has made further refinements of the Bhatia–Davis inequality.
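The two bounds can be compared on a skewed sample (illustrative values). Popoviciu's bound is (M - m)^2/4, which needs only the range, while Bhatia–Davis also uses the mean and is never larger:

```python
import numpy as np

# Skewed sample: every value equals m or M, so Bhatia–Davis is tight here.
a = np.array([0.0, 0.0, 0.0, 1.0])
m, M, mu = a.min(), a.max(), a.mean()

variance = a.var()
bhatia_davis = (M - mu) * (mu - m)     # requires the mean mu
popoviciu = (M - m) ** 2 / 4           # requires only the bounds m, M

assert variance <= bhatia_davis <= popoviciu
print(f"var = {variance:.4f}, BD = {bhatia_davis:.4f}, Pop = {popoviciu:.4f}")
```

Here the variance (0.1875) attains the Bhatia–Davis bound exactly, since every value equals ''m'' or ''M'', while Popoviciu's bound (0.25) is strict because the sample is not split evenly between the two endpoints.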


See also

* Cramér–Rao bound
* Chapman–Robbins bound
* Popoviciu's inequality on variances

