In statistics, and specifically in the study of the Dirichlet distribution, a neutral vector of random variables is one that exhibits a particular type of statistical independence amongst its elements. In particular, when the elements of the random vector must add up to a certain sum, an element of the vector is neutral with respect to the others if the distribution of the vector created by expressing the remaining elements as proportions of their total is independent of the element that was omitted.


Definition

A single element X_i of a random vector X_1,X_2,\ldots,X_k is neutral if the ''relative'' proportions of all the other elements are independent of X_i.

Formally, consider the vector of random variables

:X=(X_1,\ldots,X_k)

where

:\sum_{i=1}^k X_i=1.

The values X_i are interpreted as lengths whose sum is unity. In a variety of contexts, it is often desirable to eliminate a proportion, say X_1, and consider the distribution of the remaining intervals within the remaining length. The first element of X, viz. X_1, is defined as ''neutral'' if X_1 is statistically independent of the vector

:X^*_1 = \left( \frac{X_2}{1-X_1}, \frac{X_3}{1-X_1}, \ldots, \frac{X_k}{1-X_1} \right).

Variable X_2 is neutral if X_2/(1-X_1) is independent of the remaining interval: that is, X_2/(1-X_1) being independent of

:X^*_{1,2} = \left( \frac{X_3}{1-X_1-X_2}, \frac{X_4}{1-X_1-X_2}, \ldots, \frac{X_k}{1-X_1-X_2} \right).

Thus X_2, viewed as the first element of Y = (X_2,X_3,\ldots,X_k)/(1-X_1), is neutral.

In general, variable X_j is neutral if (X_1,\ldots,X_{j-1}) is independent of

:X^*_{1,\ldots,j} = \left( \frac{X_{j+1}}{1-X_1-\cdots-X_j}, \ldots, \frac{X_k}{1-X_1-\cdots-X_j} \right).
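The definition above lends itself to a quick empirical sanity check. The sketch below is not from the source; the helper names draw_dirichlet and sample_correlation are illustrative. It draws Dirichlet samples by normalizing independent Gamma variates and verifies that the sample correlation between X_1 and the rescaled proportion X_2/(1-X_1) is near zero, as neutrality of X_1 requires.

```python
import random

def draw_dirichlet(alpha, rng=random):
    """One draw from Dirichlet(alpha): normalize independent Gamma(a, 1) variates."""
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [gi / total for gi in g]

def sample_correlation(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

random.seed(0)
alpha = [2.0, 3.0, 4.0]
samples = [draw_dirichlet(alpha) for _ in range(20000)]

# Neutrality of X_1 implies X_1 is independent of X_2 / (1 - X_1),
# so their sample correlation should be close to zero.
x1 = [x[0] for x in samples]
u = [x[1] / (1 - x[0]) for x in samples]
r = sample_correlation(x1, u)
print(abs(r) < 0.05)
```

Near-zero correlation is only a necessary symptom of independence, not a proof of it; the simulation is meant as a sanity check of the definition, not a test of neutrality.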


Complete neutrality

A vector for which each element is neutral is completely neutral. If X = (X_1, \ldots, X_k)\sim\operatorname{Dir}(\alpha) is drawn from a Dirichlet distribution, then X is completely neutral. In 1980, James and Mosimann showed that the Dirichlet distribution is characterised by neutrality.
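Complete neutrality of the Dirichlet distribution is equivalent to the mutual independence of the stick-breaking residuals X_1, X_2/(1-X_1), X_3/(1-X_1-X_2), and so on, each of which is Beta-distributed. The sketch below (an illustration, not from the source) checks two consequences by simulation: two consecutive residuals are empirically uncorrelated, and X_2/(1-X_1) has the mean of a Beta(\alpha_2, \alpha_3+\alpha_4) variable.

```python
import random

def draw_dirichlet(alpha, rng=random):
    # Standard construction: normalize independent Gamma(a_i, 1) draws.
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [gi / total for gi in g]

random.seed(1)
alpha = [1.5, 2.5, 3.0, 2.0]
n = 20000
samples = [draw_dirichlet(alpha) for _ in range(n)]

# Stick-breaking residuals: under Dirichlet(alpha) these are independent
# Beta variables, which is exactly complete neutrality.
u = [x[1] / (1 - x[0]) for x in samples]           # X_2 / (1 - X_1)
v = [x[2] / (1 - x[0] - x[1]) for x in samples]    # X_3 / (1 - X_1 - X_2)

mu, mv = sum(u) / n, sum(v) / n
cov = sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n
sd = (sum((a - mu) ** 2 for a in u) / n * sum((b - mv) ** 2 for b in v) / n) ** 0.5
r = cov / sd

# u should behave like Beta(2.5, 3.0 + 2.0), whose mean is 2.5 / 7.5.
print(abs(r) < 0.05, abs(mu - 2.5 / 7.5) < 0.02)
```

The same check applied to every coordinate in turn would illustrate complete neutrality in full; only one step is shown here for brevity.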


See also

* Generalized Dirichlet distribution


References

* James, Ian R.; Mosimann, James E. (1980). "A new characterization of the Dirichlet distribution through neutrality". ''The Annals of Statistics'', 8 (1), 183–189.