Paley–Zygmund Inequality

In mathematics, the Paley–Zygmund inequality bounds the probability that a positive random variable is small, in terms of its first two moments. The inequality was proved by Raymond Paley and Antoni Zygmund.

Theorem: If ''Z'' ≥ 0 is a random variable with finite variance, and if 0 \le \theta \le 1, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge (1-\theta)^2 \frac{\operatorname{E}[Z]^2}{\operatorname{E}[Z^2]}.

Proof: First,

: \operatorname{E}[Z] = \operatorname{E}\big[ Z \, \mathbf{1}_{\{ Z \le \theta \operatorname{E}[Z] \}} \big] + \operatorname{E}\big[ Z \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} \big].

The first addend is at most \theta \operatorname{E}[Z], while the second is at most \operatorname{E}[Z^2]^{1/2} \operatorname{P}( Z > \theta \operatorname{E}[Z] )^{1/2} by the Cauchy–Schwarz inequality. The desired inequality then follows. ∎
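As a quick illustration (not part of the original article), the theorem can be checked exactly for a simple two-point distribution; this is a sketch using Python's `fractions` module, and the function and variable names are illustrative.

```python
from fractions import Fraction

def paley_zygmund_bound(dist, theta):
    """For a finite discrete distribution given as (value, probability)
    pairs, return P(Z > theta*E[Z]) and the Paley-Zygmund lower bound
    (1-theta)^2 * E[Z]^2 / E[Z^2], both computed exactly."""
    ez = sum(v * p for v, p in dist)        # E[Z]
    ez2 = sum(v * v * p for v, p in dist)   # E[Z^2]
    tail = sum(p for v, p in dist if v > theta * ez)  # P(Z > theta E[Z])
    bound = (1 - theta) ** 2 * ez ** 2 / ez2
    return tail, bound

# Z = 10 with probability 1/5, and 0 otherwise: E[Z] = 2, E[Z^2] = 20.
dist = [(Fraction(10), Fraction(1, 5)), (Fraction(0), Fraction(4, 5))]
tail, bound = paley_zygmund_bound(dist, Fraction(1, 2))
print(tail, bound)   # 1/5 1/20
assert tail >= bound
```

Here the true tail probability is 1/5 while the bound guarantees only 1/20, showing that the inequality holds but can be far from tight for very spread-out distributions.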


Related inequalities

The Paley–Zygmund inequality can be written as

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{Var} Z + \operatorname{E}[Z]^2}.

This can be improved. By the Cauchy–Schwarz inequality,

: \operatorname{E}[Z] - \theta \operatorname{E}[Z] \le \operatorname{E}\big[ (Z - \theta \operatorname{E}[Z]) \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} \big] \le \operatorname{E}\big[ (Z - \theta \operatorname{E}[Z])^2 \big]^{1/2} \operatorname{P}( Z > \theta \operatorname{E}[Z] )^{1/2},

which, after rearranging, implies that

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{E}[( Z - \theta \operatorname{E}[Z] )^2]} = \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{Var} Z + (1-\theta)^2 \operatorname{E}[Z]^2}.

This inequality is sharp; equality is achieved if Z almost surely equals a positive constant.

In turn, this implies another convenient form (known as Cantelli's inequality), namely

: \operatorname{P}( Z > \mu - \theta \sigma ) \ge \frac{\theta^2}{1 + \theta^2},

where \mu = \operatorname{E}[Z] and \sigma^2 = \operatorname{Var} Z. This follows from the substitution \theta \mapsto 1 - \theta\sigma/\mu in the improved inequality above, valid when 0 \le \mu - \theta\sigma \le \mu.

A strengthened form of the Paley–Zygmund inequality states that if Z is a non-negative random variable, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] \mid Z > 0 ) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{P}(Z > 0) \, \operatorname{E}[Z^2]}

for every 0 \le \theta \le 1. This inequality follows by applying the usual Paley–Zygmund inequality to the conditional distribution of Z given that it is positive, and noting that the various factors of \operatorname{P}(Z > 0) cancel.

Both this inequality and the usual Paley–Zygmund inequality also admit L^p versions: if Z is a non-negative random variable and p > 1, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] \mid Z > 0 ) \ge \frac{(1-\theta)^{p/(p-1)} \, \operatorname{E}[Z]^{p/(p-1)}}{\operatorname{P}(Z > 0)^{1/(p-1)} \, \operatorname{E}[Z^p]^{1/(p-1)}}

for every 0 \le \theta \le 1. This follows by the same proof as above, but using Hölder's inequality in place of the Cauchy–Schwarz inequality.
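These refinements can likewise be checked exactly on a discrete example (a sketch not in the original article; all names are illustrative). Since E[Z^2] = Var Z + E[Z]^2 and (1-θ)^2 ≤ 1, the improved bound is never smaller than the basic one, and the conditional bound concerns the tail given Z > 0.

```python
from fractions import Fraction

def pz_bounds(dist, theta):
    """Exact tail probability and three Paley-Zygmund-type lower bounds
    for a finite discrete distribution of (value, probability) pairs."""
    ez = sum(v * p for v, p in dist)             # E[Z]
    ez2 = sum(v * v * p for v, p in dist)        # E[Z^2]
    var = ez2 - ez ** 2                          # Var Z
    q = sum(p for v, p in dist if v > 0)         # P(Z > 0)
    tail = sum(p for v, p in dist if v > theta * ez)  # P(Z > theta E[Z])
    basic = (1 - theta) ** 2 * ez ** 2 / ez2
    improved = (1 - theta) ** 2 * ez ** 2 / (var + (1 - theta) ** 2 * ez ** 2)
    conditional = (1 - theta) ** 2 * ez ** 2 / (q * ez2)  # bounds P(tail | Z > 0)
    return tail, q, basic, improved, conditional

# Z = 10 with probability 1/5, and 0 otherwise: E[Z] = 2, Var Z = 16.
dist = [(Fraction(10), Fraction(1, 5)), (Fraction(0), Fraction(4, 5))]
tail, q, basic, improved, conditional = pz_bounds(dist, Fraction(1, 2))

assert basic <= improved <= tail     # 1/20 <= 1/17 <= 1/5
assert conditional <= tail / q       # 1/4 <= 1 (strengthened, conditional form)
```

On this example the improved bound 1/17 indeed beats the basic bound 1/20, and conditioning on Z > 0 tightens the guaranteed tail mass from 1/20 to 1/4.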


See also

* Cantelli's inequality
* Second moment method
* Concentration inequality – a summary of tail bounds on random variables

