In probability theory, de Finetti's theorem states that exchangeable observations are conditionally independent relative to some latent variable. An epistemic probability distribution could then be assigned to this variable. It is named in honor of Bruno de Finetti.

For the special case of an exchangeable sequence of Bernoulli random variables it states that such a sequence is a "mixture" of sequences of independent and identically distributed (i.i.d.) Bernoulli random variables. A sequence of random variables is called exchangeable if the joint distribution of the sequence is unchanged by any permutation of the indices. While the variables of the exchangeable sequence are not ''themselves'' independent, only exchangeable, there is an ''underlying'' family of i.i.d. random variables. That is, there are underlying, generally unobservable, quantities that are i.i.d. – exchangeable sequences are mixtures of i.i.d. sequences.


Background

A Bayesian statistician often seeks the conditional probability distribution of a random quantity given the data. The concept of exchangeability was introduced by de Finetti. De Finetti's theorem explains a mathematical relationship between independence and exchangeability. An infinite sequence

:X_1, X_2, X_3, \dots

of random variables is said to be exchangeable if for any natural number ''n'', any finite sequence of indices ''i''1, ..., ''i''''n'' and any permutation π: {''i''1, ..., ''i''''n''} → {''i''1, ..., ''i''''n''},

:(X_{i_1},\dots,X_{i_n}) \text{ and } (X_{\pi(i_1)},\dots,X_{\pi(i_n)})

both have the same joint probability distribution. If an identically distributed sequence is independent, then the sequence is exchangeable; however, the converse is false: there exist exchangeable random variables that are not statistically independent, for example the Pólya urn model.
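The Pólya urn can be checked directly by exact enumeration. The following Python sketch (the function name and starting urn composition are illustrative choices, not part of any standard model) computes exact sequence probabilities for an urn that starts with one red and one black ball and adds one ball of the drawn colour after each draw; every permutation of a draw sequence has the same probability, yet the draws are not independent.

```python
from fractions import Fraction

def polya_sequence_prob(outcomes, red=1, black=1):
    """Exact probability of a draw sequence from a Polya urn.

    Start with `red` red and `black` black balls; after each draw the
    drawn ball is returned together with one extra ball of its colour.
    `outcomes` is a tuple of 1s (red) and 0s (black).
    """
    p = Fraction(1)
    r, b = red, black
    for x in outcomes:
        if x == 1:
            p *= Fraction(r, r + b)
            r += 1
        else:
            p *= Fraction(b, r + b)
            b += 1
    return p

# Exchangeability: every permutation of (1, 1, 0) has the same probability.
probs = {seq: polya_sequence_prob(seq) for seq in [(1, 1, 0), (1, 0, 1), (0, 1, 1)]}
assert len(set(probs.values())) == 1

# But the draws are not independent: P(X1=1, X2=1) != P(X1=1) * P(X2=1).
p11 = polya_sequence_prob((1, 1))                              # = 1/3
p1 = polya_sequence_prob((1,))                                 # = 1/2
p2 = polya_sequence_prob((1, 1)) + polya_sequence_prob((0, 1)) # = 1/2
assert p11 != p1 * p2
```

By symmetry P(''X''2 = 1) equals P(''X''1 = 1) = 1/2, but P(''X''1 = 1, ''X''2 = 1) = 1/3 ≠ 1/4, exhibiting the positive dependence typical of exchangeable, non-i.i.d. sequences.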


Statement of the theorem

A random variable ''X'' has a Bernoulli distribution if Pr(''X'' = 1) = ''p'' and Pr(''X'' = 0) = 1 − ''p'' for some ''p'' ∈ (0, 1).

De Finetti's theorem states that the probability distribution of any infinite exchangeable sequence of Bernoulli random variables is a "mixture" of the probability distributions of independent and identically distributed sequences of Bernoulli random variables. "Mixture", in this sense, means a weighted average, but this need not mean a finite or countably infinite (i.e., discrete) weighted average: it can be an integral rather than a sum.

More precisely, suppose ''X''1, ''X''2, ''X''3, ... is an infinite exchangeable sequence of Bernoulli-distributed random variables. Then there is some probability distribution ''m'' on the interval [0, 1] and some random variable ''Y'' such that
* The probability distribution of ''Y'' is ''m'', and
* The conditional probability distribution of the whole sequence ''X''1, ''X''2, ''X''3, ... given the value of ''Y'' is described by saying that
** ''X''1, ''X''2, ''X''3, ... are conditionally independent given ''Y'', and
** For any ''i'' ∈ {1, 2, 3, ...}, the conditional probability that ''X''''i'' = 1, given the value of ''Y'', is ''Y''.
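The representation in the theorem suggests a direct two-stage sampling procedure: draw ''Y'' = ''p'' from the mixing distribution ''m'', then draw conditionally i.i.d. Bernoulli(''p'') variables. A minimal Python sketch (the function name and `mixing_sampler` interface are illustrative assumptions, not a library API):

```python
import random

def sample_exchangeable(n, mixing_sampler, rng=None):
    """Draw one realization of an exchangeable Bernoulli sequence.

    Two-stage sampling per the de Finetti representation: first draw
    Y = p from the mixing distribution m, then draw n conditionally
    i.i.d. Bernoulli(p) variables.  `mixing_sampler` is any
    zero-argument function returning a value in [0, 1].
    """
    rng = rng or random.Random()
    p = mixing_sampler()
    return [1 if rng.random() < p else 0 for _ in range(n)]

# Example: take the mixing distribution m to be uniform on [0, 1].
rng = random.Random(0)
seq = sample_exchangeable(10, rng.random, rng=rng)
```

Any exchangeable joint distribution on infinite 0–1 sequences arises this way for some ''m''; the uniform choice above is only one example.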


Another way of stating the theorem

Suppose X_1,X_2,X_3,\ldots is an infinite exchangeable sequence of Bernoulli random variables. Then X_1,X_2,X_3,\ldots are conditionally independent and identically distributed given the exchangeable sigma-algebra (i.e., the sigma-algebra consisting of the events that are measurable with respect to X_1,X_2,\ldots and invariant under finite permutations of the indices).


Example

Here is a concrete example. We construct a sequence

:X_1, X_2, X_3, \dots

of random variables, by "mixing" two i.i.d. sequences as follows. We assume ''p'' = 2/3 with probability 1/2 and ''p'' = 9/10 with probability 1/2. Given the event ''p'' = 2/3, the conditional distribution of the sequence is that the ''X''''i'' are independent and identically distributed and ''X''1 = 1 with probability 2/3 and ''X''1 = 0 with probability 1 − 2/3. Given the event ''p'' = 9/10, the conditional distribution of the sequence is that the ''X''''i'' are independent and identically distributed and ''X''1 = 1 with probability 9/10 and ''X''1 = 0 with probability 1 − 9/10.

This can be interpreted as follows: Make two biased coins, one showing "heads" with probability 2/3 and one showing "heads" with probability 9/10. Flip a fair coin once to decide which biased coin to use for all flips that are recorded. Here "heads" at flip ''i'' means ''X''''i'' = 1.

The independence asserted here is ''conditional'' independence, i.e. the Bernoulli random variables in the sequence are conditionally independent given the event that ''p'' = 2/3, and are conditionally independent given the event that ''p'' = 9/10. But they are not unconditionally independent; they are positively correlated.

In view of the strong law of large numbers, we can say that

:\lim_{n\to\infty} \frac{X_1+\cdots+X_n}{n} = \begin{cases} 2/3 & \text{with probability } 1/2, \\ 9/10 & \text{with probability } 1/2. \end{cases}

Rather than concentrating probability 1/2 at each of two points between 0 and 1, the "mixing distribution" can be any probability distribution supported on the interval from 0 to 1; which one it is depends on the joint distribution of the infinite sequence of Bernoulli random variables.

The definition of exchangeability, and the statement of the theorem, also makes sense for finite-length sequences

:X_1,\dots, X_n,

but the theorem is not generally true in that case. It is true if the sequence can be extended to an exchangeable sequence that is infinitely long. The simplest example of an exchangeable sequence of Bernoulli random variables that cannot be so extended is the one in which ''X''1 = 1 − ''X''2 and ''X''1 is either 0 or 1, each with probability 1/2. This sequence is exchangeable, but cannot be extended to an exchangeable sequence of length 3, let alone an infinitely long one.
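The two-coin mixture above can be simulated to illustrate both claims: the long-run frequency of heads converges to whichever ''p'' the fair coin selected, and averaging over many independent realizations exposes the positive correlation. A Python sketch (the function name is my own; the 0.01 tolerance is an illustrative choice large enough for the sample sizes used):

```python
import random

def mixture_sequence(n, rng):
    """Flip a fair coin to pick p in {2/3, 9/10}, then record n flips
    of the chosen biased coin (1 = heads)."""
    p = 2 / 3 if rng.random() < 0.5 else 9 / 10
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    return p, xs

rng = random.Random(42)

# Strong law: the empirical frequency is close to the selected p.
p, xs = mixture_sequence(100_000, rng)
freq = sum(xs) / len(xs)
assert abs(freq - p) < 0.01

# Positive correlation across independent realizations of the mixture:
# E[X1 X2] = ((2/3)^2 + (9/10)^2) / 2 exceeds (E[X1])^2 = ((2/3 + 9/10) / 2)^2.
pairs = [mixture_sequence(2, rng)[1] for _ in range(200_000)]
e1 = sum(x for x, _ in pairs) / len(pairs)
e12 = sum(x * y for x, y in pairs) / len(pairs)
assert e12 > e1 * e1  # sample estimate of Cov(X1, X2) is positive
```

Unconditionally, E[''X''1''X''2] ≈ 0.627 while (E[''X''1])² ≈ 0.614, so Cov(''X''1, ''X''2) ≈ 0.014 > 0, matching the positive correlation asserted in the text.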


Extensions

Versions of de Finetti's theorem for ''finite'' exchangeable sequences, and for ''Markov exchangeable'' sequences, have been proved by Diaconis and Freedman and by Kerns and Szekely. Two notions of partial exchangeability of arrays, known as ''separate'' and ''joint exchangeability'', lead to extensions of de Finetti's theorem for arrays by Aldous and Hoover. The computable de Finetti theorem shows that if an exchangeable sequence of real random variables is given by a computer program, then a program which samples from the mixing measure can be automatically recovered. In the setting of free probability, there is a noncommutative extension of de Finetti's theorem which characterizes noncommutative sequences invariant under quantum permutations. Extensions of de Finetti's theorem to quantum states have been found to be useful in quantum information, in topics like quantum key distribution and entanglement detection.


See also

* Choquet theory
* Hewitt–Savage zero–one law
* Krein–Milman theorem


References


External links

* {{SpringerEOM|id=De_Finetti_theorem|first=L.|last=Accardi|title=De Finetti theorem}}
* What is so cool about De Finetti's representation theorem?