The order in probability notation is used in probability theory and statistical theory in direct parallel to the big-O notation that is standard in mathematics. Where the big-O notation deals with the convergence of sequences or sets of ordinary numbers, the order in probability notation deals with convergence of sets of random variables, where convergence is in the sense of convergence in probability.
Definitions
Small O: convergence in probability
For a set of random variables <math>X_n</math> and a corresponding set of constants <math>a_n</math> (both indexed by <math>n</math>, which need not be discrete), the notation
:<math>X_n = o_p(a_n)</math>
means that the set of values <math>X_n/a_n</math> converges to zero in probability as <math>n</math> approaches an appropriate limit.
Equivalently, <math>X_n = o_p(a_n)</math> can be written as <math>X_n/a_n = o_p(1)</math>, where <math>X_n = o_p(1)</math> is defined as
:<math>\lim_{n\to\infty} P\big(|X_n| \ge \varepsilon\big) = 0</math>
for every positive <math>\varepsilon</math>.
[Yvonne M. Bishop, Stephen E. Fienberg, Paul W. Holland (1975, 2007). ''Discrete Multivariate Analysis''. Springer.]
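The defining limit lends itself to a quick numerical check. The following sketch (an illustrative simulation, not from the source) takes <math>X_n</math> to be the mean of <math>n</math> standard normal draws, so that <math>X_n = o_p(1)</math> by the law of large numbers, and estimates <math>P(|X_n| \ge \varepsilon)</math> by Monte Carlo for increasing <math>n</math>:

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_exceeds(n, eps=0.1, reps=500):
    # Monte Carlo estimate of P(|X_n| >= eps), where X_n is the
    # mean of n standard normal draws (so X_n = o_p(1)).
    means = rng.standard_normal((reps, n)).mean(axis=1)
    return float(np.mean(np.abs(means) >= eps))

for n in [10, 100, 1000, 5000]:
    print(n, prob_exceeds(n))  # estimates shrink toward zero as n grows
```

The estimated probabilities decrease toward zero as <math>n</math> grows, exactly as the small-o definition requires.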
Big O: stochastic boundedness
The notation
:<math>X_n = O_p(a_n)</math>
means that the set of values <math>X_n/a_n</math> is stochastically bounded. That is, for any <math>\varepsilon > 0</math>, there exists a finite <math>M > 0</math> and a finite <math>N > 0</math> such that
:<math>P\big(|X_n/a_n| > M\big) < \varepsilon \quad \text{for all } n > N.</math>
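Stochastic boundedness can be illustrated the same way (again an illustrative simulation, not from the source): for <math>S_n</math> a sum of <math>n</math> standard normals, <math>S_n = O_p(\sqrt{n})</math>, so a single bound <math>M</math> keeps the tail probability small uniformly in <math>n</math>:

```python
import numpy as np

rng = np.random.default_rng(1)

def tail_prob(n, M=3.0, reps=5000):
    # Estimate P(|S_n / sqrt(n)| > M) where S_n is a sum of n
    # standard normals. Since S_n / sqrt(n) ~ N(0, 1) for every n,
    # the same M works for all n: S_n = O_p(sqrt(n)).
    sums = rng.standard_normal((reps, n)).sum(axis=1)
    return float(np.mean(np.abs(sums / np.sqrt(n)) > M))

for n in [10, 100, 1000]:
    print(n, tail_prob(n))  # stays small for every n with the same M
```

Note that <math>S_n/\sqrt{n}</math> does not converge to zero, so the sequence is <math>O_p(\sqrt{n})</math> but not <math>o_p(\sqrt{n})</math>.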
Comparison of the two definitions
The difference between the two definitions is subtle. If one writes out the definitions of the limits, one gets:
* Big <math>O_p(1)</math>: for every <math>\varepsilon > 0</math>, there exist <math>\delta_\varepsilon > 0</math> and <math>N_\varepsilon</math> such that <math>P(|X_n| \ge \delta_\varepsilon) \le \varepsilon</math> for all <math>n > N_\varepsilon</math>;
* Small <math>o_p(1)</math>: for every <math>\varepsilon > 0</math> and every <math>\delta > 0</math>, there exists <math>N_{\varepsilon,\delta}</math> such that <math>P(|X_n| \ge \delta) \le \varepsilon</math> for all <math>n > N_{\varepsilon,\delta}</math>.
The difference lies in the <math>\delta</math>: for stochastic boundedness, it suffices that there exists one (arbitrarily large) <math>\delta</math> satisfying the inequality, and <math>\delta</math> is allowed to depend on <math>\varepsilon</math> (hence the <math>\delta_\varepsilon</math>). For convergence, on the other hand, the statement has to hold not just for one but for every (arbitrarily small) <math>\delta</math>. In a sense, this means that the sequence must be bounded, with a bound that gets smaller as the sample size increases.
This suggests that if a sequence is <math>o_p(1)</math>, then it is <math>O_p(1)</math>; that is, convergence in probability implies stochastic boundedness. The converse does not hold.
Example
If <math>(X_n)</math> is a stochastic sequence such that each element has finite variance, then
:<math>X_n - \operatorname{E}[X_n] = O_p\!\left(\sqrt{\operatorname{var}(X_n)}\right)</math>
(see Theorem 14.4-1 in Bishop et al.). If, moreover, <math>a_n^{-2}\operatorname{var}(X_n) = \operatorname{var}\!\left(a_n^{-1}X_n\right)</math> is a null sequence for a sequence <math>(a_n)</math> of real numbers, then <math>a_n^{-1}\left(X_n - \operatorname{E}[X_n]\right)</math> converges to zero in probability by Chebyshev's inequality, so
:<math>X_n - \operatorname{E}[X_n] = o_p(a_n).</math>
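As a concrete numerical check of this example (an illustrative simulation, not from the source), let <math>X_n</math> be a sum of <math>n</math> i.i.d. exponential variables with mean and variance 1, and take <math>a_n = n</math>; then <math>\operatorname{var}(X_n)/a_n^2 = 1/n \to 0</math>, and Chebyshev's inequality forces <math>a_n^{-1}(X_n - \operatorname{E}[X_n])</math> toward zero in probability:

```python
import numpy as np

rng = np.random.default_rng(2)

def scaled_dev_prob(n, eps=0.1, reps=2000):
    # Estimate P(|X_n - E[X_n]| / a_n >= eps) for X_n a sum of n
    # exponentials (mean 1, variance 1) and a_n = n. Here E[X_n] = n
    # and var(X_n)/a_n**2 = 1/n -> 0, so by Chebyshev's inequality
    # the probability tends to zero: X_n - E[X_n] = o_p(n).
    x = rng.exponential(1.0, (reps, n)).sum(axis=1)
    return float(np.mean(np.abs(x - n) / n >= eps))

for n in [10, 100, 1000]:
    print(n, scaled_dev_prob(n))  # decreases toward zero
```

Chebyshev bounds the probability above by <math>\operatorname{var}(X_n)/(n\varepsilon)^2 = 1/(n\varepsilon^2)</math>, which the simulated estimates respect.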