Eaton's inequality
In probability theory, Eaton's inequality is a bound on the largest values of a linear combination of bounded random variables. This inequality was described in 1974 by Morris L. Eaton (Eaton 1974).


Statement of the inequality

Let {X_1, …, X_n} be a set of real independent random variables, each with an expected value of zero and bounded above by 1 (|X_i| ≤ 1, for 1 ≤ i ≤ n). The variates do not have to be identically or symmetrically distributed. Let {a_1, …, a_n} be a set of n fixed real numbers with

: \sum_{i=1}^n a_i^2 = 1 .

Eaton showed that

: P\left( \left| \sum_{i=1}^n a_i X_i \right| \ge k \right) \le 2 \inf_{0 \le c \le k} \int_c^\infty \left( \frac{z - c}{k - c} \right)^3 \phi(z) \, dz = 2 B_E(k) ,

where φ(x) is the probability density function of the standard normal distribution.

A related bound is Edelman's:

: P\left( \left| \sum_{i=1}^n a_i X_i \right| \ge k \right) \le 2 \left( 1 - \Phi\left[ k - \frac{1.5}{k} \right] \right) = 2 B_{Ed}(k) ,

where Φ(x) is the cumulative distribution function of the standard normal distribution.

Pinelis (1994a) has shown that Eaton's bound can be sharpened:

: B_{EP} = \min\{ 1, \, k^{-2}, \, 2 B_E \} .

A set of critical values for Eaton's bound has been determined (Dufour & Hallin 1993).
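For illustration, these bounds can be evaluated numerically. The following Python sketch is not taken from the sources cited here; it assumes NumPy and SciPy are available and the function names are ad hoc. It approximates 2 B_E(k) by minimising the tail integral over c and compares the result with Edelman's bound and the Pinelis-sharpened bound.

import numpy as np
from scipy import integrate
from scipy.optimize import minimize_scalar
from scipy.stats import norm


def eaton_bound(k):
    """Return 2*B_E(k): twice the infimum over 0 <= c < k of the tail integral."""
    def tail_integral(c):
        # integral from c to infinity of ((z - c) / (k - c))**3 * phi(z) dz
        value, _ = integrate.quad(
            lambda z: ((z - c) / (k - c)) ** 3 * norm.pdf(z), c, np.inf
        )
        return value

    # Keep c strictly below k so the factor (k - c)**-3 stays finite.
    result = minimize_scalar(tail_integral, bounds=(0.0, k - 1e-6), method="bounded")
    return 2.0 * result.fun


def edelman_bound(k):
    """Return 2*(1 - Phi(k - 1.5/k))."""
    return 2.0 * (1.0 - norm.cdf(k - 1.5 / k))


def pinelis_sharpened_bound(k):
    """Return min{1, k**-2, 2*B_E(k)}."""
    return min(1.0, k ** -2, eaton_bound(k))


if __name__ == "__main__":
    for k in (1.5, 2.0, 3.0):
        print(k, eaton_bound(k), edelman_bound(k), pinelis_sharpened_bound(k))

The minimisation over c uses a bounded scalar search that stops just short of c = k, where the integrand's normalising factor would become singular.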


Related inequalities

Let {a_1, …, a_n} be a set of independent Rademacher random variables – P(a_i = 1) = P(a_i = −1) = 1/2. Let Z be a normally distributed variate with a mean of 0 and a variance of 1. Let {b_1, …, b_n} be a set of n fixed real numbers such that

: \sum_{i=1}^n b_i^2 = 1 .

This last condition is required by the Riesz–Fischer theorem, which states that

: a_1 b_1 + \cdots + a_n b_n

will converge if and only if

: \sum_{i=1}^n b_i^2

is finite. Then

: E f( a_1 b_1 + \cdots + a_n b_n ) \le E f( Z )

for f(x) = |x|^p. The case p ≥ 3 was proved by Whittle (1960) and p ≥ 2 was proved by Haagerup (1982).

If f(x) = e^{λx} with λ ≥ 0, then

: E f( a_1 b_1 + \cdots + a_n b_n ) \le \inf_{\lambda \ge 0} \left[ \frac{ E( e^{\lambda Z} ) }{ e^{\lambda x} } \right] = e^{ -x^2 / 2 } ,

where inf denotes the infimum (Hoeffding 1963).

Let

: S_n = a_1 b_1 + \cdots + a_n b_n .

Then (Pinelis 1994b)

: P( S_n \ge x ) \le \frac{ 2 e^3 }{ 9 } P( Z \ge x ) .

The constant in the last inequality is approximately 4.4634. An alternative bound is also known (de la Peña, Lai & Shao 2009):

: P( S_n \ge x ) \le e^{ -x^2 / 2 } .

This last bound is related to Hoeffding's inequality. In the uniform case where all the b_i = n^{−1/2}, the maximum value of S_n is n^{1/2}. In this case van Zuijlen (2011) has shown that

: P( | S_n - \mu | \le \sigma ) \ge 0.5 ,

where μ is the mean and σ is the standard deviation of the sum.
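These Rademacher-sum bounds can be checked by simulation in the uniform case b_i = n^{−1/2}. The following Python sketch is illustrative only and assumes NumPy and SciPy; it compares the empirical tail of S_n with the (2e^3/9) P(Z ≥ x) and e^{−x^2/2} bounds, and estimates the probability that the sum lies within one standard deviation of its mean.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, trials = 100, 100_000

# Uniform case: b_i = n**-0.5, so that sum(b_i**2) = 1.
b = np.full(n, n ** -0.5)

# S_n = a_1 b_1 + ... + a_n b_n with independent Rademacher a_i
# (+1 or -1, each with probability 1/2).
a = rng.choice([-1.0, 1.0], size=(trials, n))
S = a @ b

x = 2.0
empirical_tail = np.mean(S >= x)
gaussian_comparison = (2 * np.e ** 3 / 9) * (1 - norm.cdf(x))  # approx. 4.4634 * P(Z >= x)
hoeffding_type = np.exp(-x ** 2 / 2)
print(empirical_tail, gaussian_comparison, hoeffding_type)

# van Zuijlen: the sum lies within one standard deviation of its mean
# (here mean 0 and standard deviation 1) with probability at least 0.5.
print(np.mean(np.abs(S) <= 1.0))

Both printed bounds should dominate the empirical tail probability, and the last printed value should be at least one half.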


References

Eaton, Morris L. (1974). "A probability inequality for linear combinations of bounded random variables". Annals of Statistics 2(3): 609–614.
Pinelis, I. (1994a). "Extremal probabilistic problems and Hotelling's T2 test under a symmetry condition". Annals of Statistics 22(1): 357–368.
Dufour, J.-M.; Hallin, M. (1993). "Improved Eaton bounds for linear combinations of bounded random variables, with statistical applications". Journal of the American Statistical Association 88(243): 1026–1033.
Whittle, P. (1960). "Bounds for the moments of linear and quadratic forms in independent variables". Teor. Verojatnost. i Primenen. 5: 331–335. MR0133849.
Haagerup, U. (1982). "The best constants in the Khinchine inequality". Studia Math. 70: 231–283. MR0654838.
Hoeffding, W. (1963). "Probability inequalities for sums of bounded random variables". Journal of the American Statistical Association 58: 13–30. MR144363.
Pinelis, I. (1994b). "Optimum bounds for the distributions of martingales in Banach spaces". Annals of Probability 22(4): 1679–1706.
de la Peña, V. H.; Lai, T. L.; Shao, Q. (2009). Self-Normalized Processes. Springer-Verlag, New York.
van Zuijlen, Martien C. A. (2011). "On a conjecture concerning the sum of independent Rademacher random variables". https://arxiv.org/abs/1112.4988