In probability theory, an indecomposable distribution is a probability distribution that cannot be represented as the distribution of the sum of two or more non-constant independent random variables: ''Z'' ≠ ''X'' + ''Y''. If it can be so expressed, it is decomposable: ''Z'' = ''X'' + ''Y''. If, further, it can be expressed as the distribution of the sum of two or more independent ''identically'' distributed random variables, then it is divisible: ''Z'' = ''X''1 + ''X''2.


Examples


Indecomposable

* The simplest examples are Bernoulli distributions: if
::X = \begin{cases} 1 & \text{with probability } p, \\ 0 & \text{with probability } 1-p, \end{cases}
:then the probability distribution of ''X'' is indecomposable.
:Proof: Given non-constant distributions ''U'' and ''V'', so that ''U'' assumes at least two values ''a'', ''b'' and ''V'' assumes at least two values ''c'', ''d'', with ''a'' < ''b'' and ''c'' < ''d'', the sum ''U'' + ''V'' assumes at least three distinct values: ''a'' + ''c'', ''a'' + ''d'', ''b'' + ''d'' (''b'' + ''c'' may be equal to ''a'' + ''d'', for example if one uses 0, 1 and 0, 1). Thus the sum of two non-constant independent random variables assumes at least three values, so the Bernoulli distribution, which assumes only two, is not such a sum.
* Suppose ''a'' + ''b'' + ''c'' = 1, with ''a'', ''b'', ''c'' ≥ 0, and
::X = \begin{cases} 2 & \text{with probability } a, \\ 1 & \text{with probability } b, \\ 0 & \text{with probability } c. \end{cases}
:This probability distribution is decomposable (as the distribution of the sum of two Bernoulli-distributed random variables) if
::\sqrt{a} + \sqrt{c} \le 1
:and otherwise indecomposable. To see this, suppose ''U'' and ''V'' are independent random variables and ''U'' + ''V'' has this probability distribution. Then we must have
::U = \begin{cases} 1 & \text{with probability } p, \\ 0 & \text{with probability } 1-p, \end{cases} \qquad V = \begin{cases} 1 & \text{with probability } q, \\ 0 & \text{with probability } 1-q, \end{cases}
:for some ''p'', ''q'' ∈ [0, 1], by similar reasoning to the Bernoulli case (otherwise the sum ''U'' + ''V'' would assume more than three values). It follows that
::a = pq,
::c = (1-p)(1-q),
::b = 1 - a - c.
:This system of two quadratic equations in the two variables ''p'' and ''q'' has a solution (''p'', ''q'') ∈ [0, 1]² if and only if
::\sqrt{a} + \sqrt{c} \le 1.
:Thus, for example, the discrete uniform distribution on the set {0, 1, 2} is indecomposable, but the binomial distribution for two trials each with success probability 1/2, which gives respective probabilities ''a'', ''b'', ''c'' of 1/4, 1/2, 1/4, is decomposable. (A numerical check of this criterion is sketched after this list.)
* An absolutely continuous indecomposable distribution: it can be shown that the distribution whose density function is
::f(x) = \frac{x^2 e^{-x^2/2}}{\sqrt{2\pi}}
:is indecomposable.
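The decomposability criterion for the three-point distribution above can be verified numerically. The following minimal Python sketch (the function name ''decomposable_three_point'' is illustrative, not standard) recovers ''p'' and ''q'' as roots of a quadratic, exactly as in the argument above, and compares the result with the criterion √''a'' + √''c'' ≤ 1:

import math

def decomposable_three_point(a, b, c):
    """True if the law P(X=2)=a, P(X=1)=b, P(X=0)=c (with a+b+c=1) is
    the law of a sum of two independent Bernoulli random variables."""
    # If X = U + V with U ~ Bernoulli(p) and V ~ Bernoulli(q), then
    # a = p*q and c = (1-p)*(1-q), hence p + q = 1 + a - c and p*q = a,
    # so p and q are the roots of t**2 - (1 + a - c)*t + a = 0.
    disc = (1 + a - c) ** 2 - 4 * a
    return disc >= 0        # real roots exist; they then automatically lie in [0, 1]

for a, b, c in [(1/4, 1/2, 1/4),    # binomial distribution for two trials: decomposable
                (1/3, 1/3, 1/3)]:   # discrete uniform on {0, 1, 2}: indecomposable
    print(decomposable_three_point(a, b, c),
          math.sqrt(a) + math.sqrt(c) <= 1)   # the two criteria agree

Both checks print ''True'' for the binomial case and ''False'' for the discrete uniform case, matching the criterion stated above.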


Decomposable

* All infinitely divisible distributions are ''a fortiori'' decomposable; in particular, this includes the stable distributions, such as the normal distribution.
* The uniform distribution on the interval [0, 1] is decomposable, since it is the sum of the Bernoulli variable that assumes 0 or 1/2 with equal probabilities and the uniform distribution on [0, 1/2]. Iterating this yields the infinite decomposition
:: \sum_{n=1}^\infty \frac{X_n}{2^n},
:where the independent random variables ''X''''n'' are each equal to 0 or 1 with equal probabilities – each ''X''''n'' is a Bernoulli trial giving the ''n''th digit of the binary expansion. (A simulation of this decomposition is sketched after this list.)
* A sum of indecomposable random variables is decomposable into the original summands, but it may nevertheless be infinitely divisible. Suppose a random variable ''Y'' has the geometric distribution
::\Pr(Y = n) = (1-p)^n p
:on {0, 1, 2, ...}. For any positive integer ''k'', there is a sequence of negative-binomially distributed random variables ''Y''''j'', ''j'' = 1, ..., ''k'', such that ''Y''1 + ... + ''Y''''k'' has this geometric distribution (a numerical check of this is sketched after this list). Therefore, this distribution is infinitely divisible.
:On the other hand, let ''D''''n'' be the ''n''th binary digit of ''Y'', for ''n'' ≥ 0. Then the ''D''''n'' are independent and
:: Y = \sum_{n=0}^\infty 2^n D_n,
:and each term in this sum is indecomposable.
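The dyadic decomposition of the uniform distribution above can be illustrated by simulation. The following minimal Python/NumPy sketch (variable names are illustrative) sums independent fair Bernoulli digits ''X''''n'' weighted by 2^(−''n'') and compares empirical quantiles with those of the uniform distribution on [0, 1]:

import numpy as np

rng = np.random.default_rng(0)
m, samples = 40, 100_000
digits = rng.integers(0, 2, size=(samples, m))     # X_n: i.i.d. digits equal to 0 or 1 with equal probabilities
weights = 0.5 ** np.arange(1, m + 1)               # 2^(-n) for n = 1, ..., m
z = digits @ weights                               # truncated version of sum_n X_n / 2^n
# empirical quantiles should be close to those of the uniform distribution on [0, 1]
print(np.round(np.quantile(z, [0.1, 0.25, 0.5, 0.75, 0.9]), 3))

With this many samples the printed quantiles come out close to 0.1, 0.25, 0.5, 0.75 and 0.9, as expected for a uniform variable.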
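Similarly, the claim that the geometric distribution arises as the sum of ''k'' negative-binomially distributed terms can be checked by convolving probability mass functions. The sketch below assumes the convention that each ''Y''''j'' has the negative binomial (Pólya) law with real shape parameter 1/''k'' and success probability ''p''; the helper ''nbinom_pmf'' is written out explicitly rather than taken from a library:

import math
import numpy as np

def nbinom_pmf(x, r, p):
    """P(X = x) for the negative binomial (Polya) law with real shape r > 0:
    number of failures before the r-th success, success probability p."""
    return math.exp(math.lgamma(x + r) - math.lgamma(x + 1) - math.lgamma(r)
                    + r * math.log(p) + x * math.log(1 - p))

p, k, N = 0.4, 5, 60                                    # geometric parameter, number of summands, truncation length
base = np.array([nbinom_pmf(x, 1.0 / k, p) for x in range(N)])
pmf = base.copy()
for _ in range(k - 1):                                  # pmf of Y_1 + ... + Y_k by repeated convolution
    pmf = np.convolve(pmf, base)[:N]
geom = np.array([(1 - p) ** n * p for n in range(N)])   # Pr(Y = n) = (1-p)^n p
print(np.max(np.abs(pmf - geom)))                       # approximately 0, up to rounding error

The maximum discrepancy is at the level of floating-point rounding, consistent with the two distributions being identical.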


Related concepts

At the other extreme from indecomposability is infinite divisibility.
* Cramér's theorem shows that while the normal distribution is infinitely divisible, it can only be decomposed into normal distributions.
* Cochran's theorem shows that the terms in a decomposition of a sum of squares of normal random variables into sums of squares of linear combinations of these variables always have independent chi-squared distributions.


See also

* Cramér's theorem
* Cochran's theorem
* Infinite divisibility (probability)
* Khinchin's theorem on the factorization of distributions


References

* Linnik, Yu. V. and Ostrovskii, I. V., ''Decomposition of Random Variables and Vectors'', American Mathematical Society, Providence, RI, 1977.
* Lukacs, Eugene, ''Characteristic Functions'', New York: Hafner Publishing Company, 1970.