Paley–Zygmund Inequality
In mathematics, the Paley–Zygmund inequality bounds the probability that a positive random variable is small, in terms of its first two moments. The inequality was proved by Raymond Paley and Antoni Zygmund.

Theorem: If ''Z'' ≥ 0 is a random variable with finite variance, and if 0 \le \theta \le 1, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge (1-\theta)^2 \frac{\operatorname{E}[Z]^2}{\operatorname{E}[Z^2]}.

Proof: First,

: \operatorname{E}[Z] = \operatorname{E}[ Z \, \mathbf{1}_{\{ Z \le \theta \operatorname{E}[Z] \}} ] + \operatorname{E}[ Z \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} ].

The first addend is at most \theta \operatorname{E}[Z], while the second is at most \operatorname{E}[Z^2]^{1/2} \operatorname{P}( Z > \theta \operatorname{E}[Z] )^{1/2} by the Cauchy–Schwarz inequality. The desired inequality then follows by rearranging. ∎
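As an illustrative sanity check (not part of the original article), the bound can be verified numerically for a concrete distribution. Here Z is taken to be exponential with rate 1, a hypothetical example chosen because its moments are known in closed form (E[Z] = 1, E[Z²] = 2):

```python
import random

# Paley–Zygmund: P(Z > theta * E[Z]) >= (1 - theta)^2 * E[Z]^2 / E[Z^2].
# Monte Carlo check with Z ~ Exponential(1), for which E[Z] = 1, E[Z^2] = 2.
random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]

mean = sum(samples) / n                       # estimates E[Z] (true value 1)
second_moment = sum(z * z for z in samples) / n  # estimates E[Z^2] (true value 2)

for theta in (0.0, 0.25, 0.5, 0.75):
    # Empirical left-hand side: P(Z > theta * E[Z]).
    lhs = sum(1 for z in samples if z > theta * mean) / n
    # Paley–Zygmund lower bound.
    rhs = (1 - theta) ** 2 * mean ** 2 / second_moment
    assert lhs >= rhs, (theta, lhs, rhs)
```

For the exponential distribution the true tail is P(Z > θ) = e^{-θ}, so the bound (1 − θ)²/2 is far from tight here; the point of the check is only that it never fails.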


Related inequalities

The Paley–Zygmund inequality can be written as

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{Var} Z + \operatorname{E}[Z]^2}.

This can be improved. By the Cauchy–Schwarz inequality,

: \operatorname{E}[ Z - \theta \operatorname{E}[Z] ] \le \operatorname{E}[ (Z - \theta \operatorname{E}[Z]) \, \mathbf{1}_{\{ Z > \theta \operatorname{E}[Z] \}} ] \le \operatorname{E}[ (Z - \theta \operatorname{E}[Z])^2 ]^{1/2} \operatorname{P}( Z > \theta \operatorname{E}[Z] )^{1/2},

which, after rearranging, implies that

: \operatorname{P}( Z > \theta \operatorname{E}[Z] ) \ge \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{E}[ (Z - \theta \operatorname{E}[Z])^2 ]} = \frac{(1-\theta)^2 \operatorname{E}[Z]^2}{\operatorname{Var} Z + (1-\theta)^2 \operatorname{E}[Z]^2}.

This inequality is sharp; equality is achieved if Z almost surely equals a positive constant.

In turn, this implies another convenient form (known as Cantelli's inequality):

: \operatorname{P}( Z > \mu - \theta \sigma ) \ge \frac{\theta^2}{1 + \theta^2},

where \mu = \operatorname{E}[Z] and \sigma^2 = \operatorname{Var} Z. This follows from the substitution \theta = 1 - \theta' \sigma / \mu, valid when 0 \le \mu - \theta' \sigma \le \mu.

A strengthened form of the Paley–Zygmund inequality states that if Z is a non-negative random variable, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] \mid Z > 0 ) \ge \frac{(1-\theta)^2 \, \operatorname{E}[Z]^2}{\operatorname{P}(Z > 0) \, \operatorname{E}[Z^2]}

for every 0 \le \theta \le 1. This inequality follows by applying the usual Paley–Zygmund inequality to the conditional distribution of Z given that it is positive, and noting that the various factors of \operatorname{P}(Z > 0) cancel.

Both this inequality and the usual Paley–Zygmund inequality also admit L^p versions: if Z is a non-negative random variable and p > 1, then

: \operatorname{P}( Z > \theta \operatorname{E}[Z] \mid Z > 0 ) \ge \frac{(1-\theta)^{p/(p-1)} \, \operatorname{E}[Z]^{p/(p-1)}}{\operatorname{P}(Z > 0) \, \operatorname{E}[Z^p]^{1/(p-1)}}

for every 0 \le \theta \le 1. This follows by the same proof as above, but using Hölder's inequality in place of the Cauchy–Schwarz inequality.
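The improved bound and the conditional strengthening can likewise be checked numerically. The sketch below (my addition, not from the article) uses a hypothetical mixed distribution Z = B·U with B ~ Bernoulli(0.3) and U ~ Uniform(1, 2), chosen so that P(Z > 0) < 1 and the conditioning on Z > 0 is non-trivial:

```python
import random

# Monte Carlo check of
#   improved:     P(Z > theta*E[Z]) >= (1-theta)^2 E[Z]^2 / (Var Z + (1-theta)^2 E[Z]^2)
#   conditional:  P(Z > theta*E[Z] | Z > 0) >= (1-theta)^2 E[Z]^2 / (P(Z > 0) E[Z^2])
# for Z = B * U, B ~ Bernoulli(0.3), U ~ Uniform(1, 2).
random.seed(1)
n = 200_000
samples = [random.uniform(1.0, 2.0) if random.random() < 0.3 else 0.0
           for _ in range(n)]

mean = sum(samples) / n                 # E[Z] = 0.3 * 1.5 = 0.45
m2 = sum(z * z for z in samples) / n    # E[Z^2] = 0.3 * 7/3 = 0.7
var = m2 - mean ** 2                    # Var Z
p_pos = sum(1 for z in samples if z > 0) / n  # P(Z > 0) = 0.3

for theta in (0.1, 0.5, 0.9):
    p_exceed = sum(1 for z in samples if z > theta * mean) / n
    improved = (1 - theta) ** 2 * mean ** 2 / (var + (1 - theta) ** 2 * mean ** 2)
    assert p_exceed >= improved, (theta, p_exceed, improved)

    # Every sample above theta*E[Z] > 0 is positive, so conditioning is a division.
    conditional = p_exceed / p_pos
    strengthened = (1 - theta) ** 2 * mean ** 2 / (p_pos * m2)
    assert conditional >= strengthened, (theta, conditional, strengthened)
```

Note that for this distribution the conditional bound is informative even when θ is close to 1, since the unconditional tail probability can never exceed P(Z > 0) = 0.3.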


See also

* Cantelli's inequality
* Second moment method
* Concentration inequality – a summary of tail bounds on random variables

