In probability theory and statistics, the marginal distribution of a subset of a collection of random variables is the probability distribution of the variables contained in the subset. It gives the probabilities of various values of the variables in the subset without reference to the values of the other variables. This contrasts with a
conditional distribution, which gives the probabilities contingent upon the values of the other variables.
Marginal variables are those variables in the subset of variables being retained. These concepts are "marginal" because they can be found by summing values in a table along rows or columns, and writing the sum in the margins of the table. The distribution of the marginal variables (the marginal distribution) is obtained by marginalizing (that is, focusing on the sums in the margin) over the distribution of the variables being discarded, and the discarded variables are said to have been marginalized out.
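The row-and-column-sum picture above can be sketched in code. The numbers below are a made-up joint probability table, chosen only for illustration:

```python
# Hypothetical joint probability table for two discrete variables X and Y.
# Rows index values of X, columns index values of Y; all entries sum to 1.
joint = [
    [0.125, 0.375],  # P(X=0, Y=0), P(X=0, Y=1)
    [0.250, 0.250],  # P(X=1, Y=0), P(X=1, Y=1)
]

# Marginal of X: sum each row (the sums written in the right-hand margin).
p_x = [sum(row) for row in joint]
# Marginal of Y: sum each column (the sums written in the bottom margin).
p_y = [sum(col) for col in zip(*joint)]
```

Here `p_x` is `[0.5, 0.5]` and `p_y` is `[0.375, 0.625]`; X has been marginalized out of the second result and Y out of the first.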
The context here is that the theoretical studies being undertaken, or the data analysis being done, involves a wider set of random variables, but attention is being limited to a reduced number of those variables. In many applications, an analysis may start with a given collection of random variables, then first extend the set by defining new ones (such as the sum of the original random variables) and finally reduce the number by placing interest in the marginal distribution of a subset (such as the sum). Several different analyses may be done, each treating a different subset of variables as the marginal variables.
Definition
Marginal probability mass function
Given a known joint distribution of two discrete random variables, say ''X'' and ''Y'', the marginal distribution of either variable – ''X'' for example – is the probability distribution of ''X'' when the values of ''Y'' are not taken into consideration. This can be calculated by summing the joint probability distribution over all values of ''Y''. Naturally, the converse is also true: the marginal distribution can be obtained for ''Y'' by summing over the separate values of ''X''.
: p_X(x_i) = \sum_j p(x_i, y_j), and
: p_Y(y_j) = \sum_i p(x_i, y_j).
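These sums over the discarded variable can be written directly in code. The joint pmf below is a hypothetical example, stored as a dictionary keyed by (x, y) pairs:

```python
# Hypothetical joint pmf p(x, y); the four probabilities sum to 1.
p = {
    (0, 0): 0.125, (0, 1): 0.375,
    (1, 0): 0.250, (1, 1): 0.250,
}

def marginal(p, axis):
    """Sum the joint pmf over the other variable.

    axis=0 keeps x (summing over y); axis=1 keeps y (summing over x).
    """
    out = {}
    for pair, prob in p.items():
        out[pair[axis]] = out.get(pair[axis], 0.0) + prob
    return out

p_x = marginal(p, 0)  # p_X(x_i) = sum_j p(x_i, y_j)
p_y = marginal(p, 1)  # p_Y(y_j) = sum_i p(x_i, y_j)
```

With these numbers, `p_x` is `{0: 0.5, 1: 0.5}` and `p_y` is `{0: 0.375, 1: 0.625}`.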
A marginal probability can always be written as an expected value:
: p_X(x) = \int_y p_{X \mid Y}(x \mid y)\, p_Y(y)\, \mathrm{d}y = \mathrm{E}_Y[p_{X \mid Y}(x \mid y)].
Intuitively, the marginal probability of ''X'' is computed by examining the conditional probability of ''X'' given a particular value of ''Y'', and then averaging this conditional probability over the distribution of all values of ''Y''.
This follows from the definition of expected value (after applying the law of the unconscious statistician):
: \mathrm{E}_Y[p_{X \mid Y}(x \mid y)] = \int_y p_{X \mid Y}(x \mid y)\, p_Y(y)\, \mathrm{d}y.
Therefore, marginalization provides the rule for the transformation of the probability distribution of a random variable ''Y'' and another random variable ''X'' = ''g''(''Y''):
: p_X(x) = \int_y p_{X \mid Y}(x \mid y)\, p_Y(y)\, \mathrm{d}y = \int_y \delta\big(x - g(y)\big)\, p_Y(y)\, \mathrm{d}y.
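The expected-value identity can be checked numerically on a small discrete example; every number below is made up for illustration:

```python
# Hypothetical marginal pmf of Y.
p_y = {0: 0.25, 1: 0.75}
# Hypothetical conditional pmf: p_x_given_y[y][x] = P(X = x | Y = y).
# Each inner dict sums to 1, as a conditional distribution must.
p_x_given_y = {
    0: {0: 0.5, 1: 0.5},
    1: {0: 0.2, 1: 0.8},
}

def marginal_x(x):
    """p_X(x) = E_Y[ p_{X|Y}(x | Y) ] = sum over y of p_{X|Y}(x|y) * p_Y(y)."""
    return sum(p_x_given_y[y][x] * p_y[y] for y in p_y)

p_x = {x: marginal_x(x) for x in (0, 1)}
```

Here averaging the conditional probabilities over Y gives p_X(0) = 0.5·0.25 + 0.2·0.75 = 0.275 and p_X(1) = 0.725, which sum to 1 as required.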
Marginal probability density function
Given two continuous random variables ''X'' and ''Y'' whose joint distribution is known, the marginal probability density function can be obtained by integrating the joint probability distribution, f_{X,Y}(x, y), over ''Y'', and vice versa. That is
: f_X(x) = \int f_{X,Y}(x, y)\, \mathrm{d}y,
and
: f_Y(y) = \int f_{X,Y}(x, y)\, \mathrm{d}x,
where f_{X,Y}(x, y) gives the joint distribution of ''X'' and ''Y'', while f_X(x) and f_Y(y) give the marginal distributions for ''X'' and ''Y'' respectively.
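As a numerical sketch of the continuous case, take the made-up joint density f(x, y) = x + y on the unit square (it integrates to 1, so it is a valid density), and approximate the marginalizing integral with a midpoint Riemann sum:

```python
# Hypothetical joint density on [0, 1] x [0, 1]; integrates to 1 over the square.
def f_joint(x, y):
    return x + y

def f_x(x, n=10_000):
    """Approximate f_X(x) = integral over [0, 1] of f(x, y) dy
    using a midpoint Riemann sum with n subintervals."""
    h = 1.0 / n
    return sum(f_joint(x, (j + 0.5) * h) for j in range(n)) * h
```

Analytically the marginal is f_X(x) = x + 1/2, so for example `f_x(0.3)` comes out very close to 0.8. In practice a quadrature routine would replace the hand-rolled sum, but the sum makes the marginalization step explicit.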