In mathematics, the Paley–Zygmund inequality bounds the probability that a positive random variable is small, in terms of its first two moments. The inequality was proved by Raymond Paley and Antoni Zygmund.

Theorem: If ''Z'' ≥ 0 is a random variable with finite variance, and if 0 \le \theta \le 1, then

: \operatorname{P}( Z > \theta\operatorname{E}[Z] ) \ge (1-\theta)^2 \frac{\operatorname{E}[Z]^2}{\operatorname{E}[Z^2]}.

Proof: First,

: \operatorname{E}[Z] = \operatorname{E}\big[ Z \, \mathbf{1}_{\{ Z \le \theta \operatorname{E}[Z] \}} \big] + \operatorname{E}\big[ Z \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} \big].

The first addend is at most \theta \operatorname{E}[Z], while the second is at most \operatorname{E}[Z^2]^{1/2} \operatorname{P}( Z > \theta\operatorname{E}[Z] )^{1/2} by the Cauchy–Schwarz inequality. The desired inequality then follows. ∎
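The theorem can be checked exactly on a concrete distribution. The sketch below (the fair-die example and the helper name `paley_zygmund_check` are illustrative choices, not from the article) computes both sides of the inequality for a range of θ:

```python
# Exact check of the Paley–Zygmund inequality on a simple discrete
# distribution (a fair six-sided die; the distribution and helper
# name are illustrative choices).

def paley_zygmund_check(values, probs, theta):
    """Return (lhs, rhs) of P(Z > theta*E[Z]) >= (1-theta)^2 E[Z]^2 / E[Z^2]."""
    ez = sum(p * v for v, p in zip(values, probs))    # E[Z]
    ez2 = sum(p * v * v for v, p in zip(values, probs))  # E[Z^2]
    lhs = sum(p for v, p in zip(values, probs) if v > theta * ez)
    rhs = (1 - theta) ** 2 * ez ** 2 / ez2
    return lhs, rhs

values = [1, 2, 3, 4, 5, 6]   # fair die: E[Z] = 3.5, E[Z^2] = 91/6
probs = [1 / 6] * 6

for theta in [0.0, 0.25, 0.5, 0.75, 1.0]:
    lhs, rhs = paley_zygmund_check(values, probs, theta)
    assert lhs >= rhs
    print(f"theta={theta:.2f}: P(Z > theta E[Z]) = {lhs:.4f} >= {rhs:.4f}")
```

Note that the bound is nontrivial even at θ = 0, where it lower-bounds P(Z > 0) by E[Z]²/E[Z²]; this is the form most often used in the second moment method.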


Related inequalities

The Paley–Zygmund inequality can be written as

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{Var} Z + \operatorname{E}[Z]^2}.

This can be improved. By the Cauchy–Schwarz inequality,

: \operatorname{E}[Z] - \theta \operatorname{E}[Z] \le \operatorname{E}\big[ (Z - \theta \operatorname{E}[Z]) \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} \big] \le \operatorname{E}\big[ (Z - \theta \operatorname{E}[Z])^2 \big]^{1/2} \operatorname{P}( Z > \theta \operatorname{E}[Z] )^{1/2},

which, after rearranging, implies that

: \operatorname{P}(Z > \theta \operatorname{E}[Z]) \ge \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{E}[(Z - \theta \operatorname{E}[Z])^2]} = \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{Var} Z + (1-\theta)^2 \operatorname{E}[Z]^2}.

This inequality is sharp; equality is achieved if Z almost surely equals a positive constant. In turn, this implies another convenient form (known as Cantelli's inequality), which is

: \operatorname{P}(Z > \mu - \theta \sigma) \ge \frac{\theta^2}{1 + \theta^2},

where \mu = \operatorname{E}[Z] and \sigma^2 = \operatorname{Var} Z. This follows from the substitution \theta' = 1 - \theta\sigma/\mu, valid when 0 \le \mu - \theta\sigma \le \mu.

A strengthened form of the Paley–Zygmund inequality states that if Z is a non-negative random variable, then

: \operatorname{P}\big( Z > \theta \operatorname{E}[Z \mid Z > 0] \big) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{E}[Z^2]}

for every 0 \le \theta \le 1. This inequality follows by applying the usual Paley–Zygmund inequality to the conditional distribution of Z given that it is positive and noting that the various factors of \operatorname{P}(Z > 0) cancel.

Both this inequality and the usual Paley–Zygmund inequality also admit L^p versions: if Z is a non-negative random variable and p > 1, then

: \operatorname{P}\big( Z > \theta \operatorname{E}[Z \mid Z > 0] \big) \ge \frac{(1-\theta)^{p/(p-1)} \, \operatorname{E}[Z]^{p/(p-1)}}{\operatorname{E}[Z^p]^{1/(p-1)}}

for every 0 \le \theta \le 1. This follows by the same proof as above but using Hölder's inequality in place of the Cauchy–Schwarz inequality.
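The relations among these bounds can also be checked numerically. The following sketch (again using a fair six-sided die; the distribution and helper names are illustrative choices) verifies that the improved bound dominates the basic one, that the Cantelli form and an L^p variant hold, and that the improved bound is tight for a constant variable:

```python
# Sketch checking the improved Paley–Zygmund bound, its Cantelli form,
# and the L^p variant on a fair six-sided die (illustrative example).

def moments(values, probs):
    """Mean and variance of a finite discrete distribution."""
    ez = sum(p * v for v, p in zip(values, probs))
    var = sum(p * (v - ez) ** 2 for v, p in zip(values, probs))
    return ez, var

def basic_bound(ez, var, theta):
    # P(Z > theta*E[Z]) >= (1-theta)^2 E[Z]^2 / (Var Z + E[Z]^2)
    return (1 - theta) ** 2 * ez ** 2 / (var + ez ** 2)

def improved_bound(ez, var, theta):
    # P(Z > theta*E[Z]) >= (1-theta)^2 E[Z]^2 / (Var Z + (1-theta)^2 E[Z]^2)
    return (1 - theta) ** 2 * ez ** 2 / (var + (1 - theta) ** 2 * ez ** 2)

def lp_bound(values, probs, theta, p):
    # L^p version; here Z > 0 always, so E[Z | Z > 0] = E[Z].
    ez = sum(pr * v for v, pr in zip(values, probs))
    ezp = sum(pr * v ** p for v, pr in zip(values, probs))
    return (1 - theta) ** (p / (p - 1)) * ez ** (p / (p - 1)) / ezp ** (1 / (p - 1))

values, probs = [1, 2, 3, 4, 5, 6], [1 / 6] * 6
ez, var = moments(values, probs)

for theta in [0.1, 0.5, 0.9]:
    tail = sum(pr for v, pr in zip(values, probs) if v > theta * ez)
    # The improved bound dominates the basic one, and both hold.
    assert basic_bound(ez, var, theta) <= improved_bound(ez, var, theta) <= tail
    assert lp_bound(values, probs, theta, p=3) <= tail

# Cantelli form: P(Z > mu - t*sigma) >= t^2 / (1 + t^2), for 0 <= mu - t*sigma <= mu.
sigma = var ** 0.5
for t in [0.5, 1.0, 1.5]:
    tail = sum(pr for v, pr in zip(values, probs) if v > ez - t * sigma)
    assert tail >= t ** 2 / (1 + t ** 2)

# The improved bound is tight when Z is almost surely a positive constant.
assert improved_bound(2.0, 0.0, 0.5) == 1.0
```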


See also

* Cantelli's inequality
* Second moment method
* Concentration inequality – a summary of tail bounds on random variables

