In probability theory and statistics, given two jointly distributed random variables X and Y, the conditional probability distribution of Y given X is the probability distribution of Y when X is known to be a particular value; in some cases the conditional probabilities may be expressed as functions containing the unspecified value x of X as a parameter. When both X and Y are categorical variables, a conditional probability table is typically used to represent the conditional probability. The conditional distribution contrasts with the marginal distribution of a random variable, which is its distribution without reference to the value of the other variable.
If the conditional distribution of Y given X is a continuous distribution, then its probability density function is known as the conditional density function. The properties of a conditional distribution, such as the moments, are often referred to by corresponding names such as the conditional mean and conditional variance.
More generally, one can refer to the conditional distribution of a subset of a set of more than two variables; this conditional distribution is contingent on the values of all the remaining variables, and if more than one variable is included in the subset then this conditional distribution is the conditional joint distribution of the included variables.
Conditional discrete distributions
For discrete random variables, the conditional probability mass function of Y given X = x can be written according to its definition as:
: p_{Y|X}(y \mid x) \equiv P(Y = y \mid X = x) = \frac{P(\{X = x\} \cap \{Y = y\})}{P(X = x)}
Due to the occurrence of P(X = x) in the denominator, this is defined only for non-zero (hence strictly positive) P(X = x).
The relation with the probability distribution of X given Y is:
: P(Y = y \mid X = x) \, P(X = x) = P(\{X = x\} \cap \{Y = y\}) = P(X = x \mid Y = y) \, P(Y = y).
Example
Consider the roll of a fair die and let X = 1 if the number is even (i.e., 2, 4, or 6) and X = 0 otherwise. Furthermore, let Y = 1 if the number is prime (i.e., 2, 3, or 5) and Y = 0 otherwise.
Then the unconditional probability that X = 1 is 3/6 = 1/2 (since there are six possible rolls of the die, of which three are even), whereas the probability that X = 1 conditional on Y = 1 is 1/3 (since there are three possible prime number rolls, namely 2, 3, and 5, of which one is even).
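The die example can be checked by computing the conditional probability mass function directly from the joint pmf; a minimal sketch in Python (the helper `conditional_pmf` is my own illustration, not a standard library function):

```python
from fractions import Fraction
from collections import defaultdict

# Joint pmf of (X, Y) for one roll of a fair die:
# X = 1 if the roll is even, Y = 1 if the roll is prime.
joint = defaultdict(Fraction)
for roll in range(1, 7):
    x = 1 if roll % 2 == 0 else 0
    y = 1 if roll in (2, 3, 5) else 0
    joint[(x, y)] += Fraction(1, 6)

def conditional_pmf(joint, y):
    """p_{X|Y}(x | y) = P(X=x, Y=y) / P(Y=y), defined only when P(Y=y) > 0."""
    p_y = sum(p for (_, y_), p in joint.items() if y_ == y)
    return {x_: p / p_y for (x_, y_), p in joint.items() if y_ == y}

print(conditional_pmf(joint, 1))  # P(X=1 | Y=1) = 1/3, P(X=0 | Y=1) = 2/3
```

Using exact rational arithmetic avoids floating-point noise and reproduces the 1/3 from the text exactly.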
Conditional continuous distributions
Similarly for continuous random variables, the conditional probability density function of Y given the occurrence of the value x of X can be written as
: f_{Y|X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)},
where f_{X,Y}(x, y) gives the joint density of X and Y, while f_X(x) gives the marginal density for X. Also in this case it is necessary that f_X(x) > 0.
The relation with the probability distribution of X given Y is given by:
: f_{Y|X}(y \mid x) \, f_X(x) = f_{X,Y}(x, y) = f_{X|Y}(x \mid y) \, f_Y(y).
The concept of the conditional distribution of a continuous random variable is not as intuitive as it might seem:
Borel's paradox shows that conditional probability density functions need not be invariant under coordinate transformations.
Example
The graph shows a bivariate normal joint density for random variables X and Y. To see the distribution of Y conditional on X = x, one can first visualize the line X = x in the X, Y plane, and then visualize the plane containing that line and perpendicular to the X, Y plane. The intersection of that plane with the joint normal density, once rescaled to give unit area under the intersection, is the relevant conditional density of Y.
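For the bivariate normal case the rescaled slice has a well-known closed form: if (X, Y) is bivariate normal with means mx, my, standard deviations sx, sy, and correlation rho, then Y given X = x is normal with mean my + rho·(sy/sx)·(x − mx) and variance (1 − rho²)·sy². A sketch (parameter values are arbitrary illustrations) comparing that closed form against the defining ratio f_{X,Y}(x, y) / f_X(x):

```python
import math

def normal_pdf(z, mu, sigma):
    """Density of a univariate normal distribution."""
    return math.exp(-((z - mu) / sigma) ** 2 / 2) / (sigma * math.sqrt(2 * math.pi))

def bivariate_normal_pdf(x, y, mx, my, sx, sy, rho):
    """Joint density f_{X,Y}(x, y) of a bivariate normal distribution."""
    zx, zy = (x - mx) / sx, (y - my) / sy
    q = (zx ** 2 - 2 * rho * zx * zy + zy ** 2) / (1 - rho ** 2)
    return math.exp(-q / 2) / (2 * math.pi * sx * sy * math.sqrt(1 - rho ** 2))

mx, my, sx, sy, rho = 0.0, 0.0, 1.0, 2.0, 0.5
x, y = 1.0, 0.7

# Conditional density as the defining ratio f_{X,Y}(x, y) / f_X(x) ...
ratio = bivariate_normal_pdf(x, y, mx, my, sx, sy, rho) / normal_pdf(x, mx, sx)
# ... agrees with the closed form Y | X=x ~ N(my + rho*(sy/sx)*(x-mx), (1-rho^2)*sy^2).
closed = normal_pdf(y, my + rho * (sy / sx) * (x - mx), sy * math.sqrt(1 - rho ** 2))
assert abs(ratio - closed) < 1e-12
```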
Relation to independence
Random variables
,
are
independent
Independent or Independents may refer to:
Arts, entertainment, and media Artist groups
* Independents (artist group), a group of modernist painters based in the New Hope, Pennsylvania, area of the United States during the early 1930s
* Independ ...
if and only if the conditional distribution of
given
is, for all possible realizations of
, equal to the unconditional distribution of
. For discrete random variables this means
for all possible
and
with
. For continuous random variables
and
, having a
joint density function, it means
for all possible
and
with
.
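For discrete variables the criterion reduces to checking that the joint pmf factors into the product of the marginals. A minimal sketch (the helper `is_independent` is my own illustration), applied to the die example from earlier, where X (even) and Y (prime) turn out to be dependent:

```python
from fractions import Fraction

# Joint pmf of (X, Y) for a fair die roll: X = even indicator, Y = prime indicator.
joint = {(0, 0): Fraction(1, 6), (1, 1): Fraction(1, 6),
         (0, 1): Fraction(2, 6), (1, 0): Fraction(2, 6)}

def is_independent(joint):
    """Check P(X=x, Y=y) == P(X=x) * P(Y=y) for every pair in the support."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return all(p == px[x] * py[y] for (x, y), p in joint.items())

print(is_independent(joint))  # False: P(X=1 | Y=1) = 1/3 differs from P(X=1) = 1/2
```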
Properties
Seen as a function of y for given x, P(Y = y | X = x) is a probability mass function and so the sum over all y (or integral if it is a conditional probability density) is 1. Seen as a function of x for given y, it is a likelihood function, so that the sum over all x need not be 1.
Additionally, a marginal of a joint distribution can be expressed as the expectation of the corresponding conditional distribution. For instance,
: p_X(x) = \operatorname{E}_{Y}\!\left[\, p_{X \mid Y}(x \mid Y) \,\right].
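Both properties can be verified on the die example's joint pmf; a minimal sketch with exact arithmetic (variable names are my own):

```python
from fractions import Fraction

# Joint pmf of (X, Y) for a fair die roll: X = even indicator, Y = prime indicator.
joint = {(0, 0): Fraction(1, 6), (1, 1): Fraction(1, 6),
         (0, 1): Fraction(2, 6), (1, 0): Fraction(2, 6)}

p_y = {y: sum(p for (_, y_), p in joint.items() if y_ == y) for y in (0, 1)}
cond = {(x, y): joint[(x, y)] / p_y[y] for (x, y) in joint}  # p_{X|Y}(x | y)

# As a function of x for fixed y, the conditional pmf sums to 1:
assert sum(cond[(x, 1)] for x in (0, 1)) == 1

# The marginal of X is the expectation over Y of the conditional pmf:
p_x = {x: sum(cond[(x, y)] * p_y[y] for y in (0, 1)) for x in (0, 1)}
assert p_x == {0: Fraction(1, 2), 1: Fraction(1, 2)}
```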
Measure-theoretic formulation
Let (\Omega, \mathcal{F}, P) be a probability space, \mathcal{G} \subseteq \mathcal{F} a \sigma-field in \mathcal{F}. Given A \in \mathcal{F}, the Radon–Nikodym theorem implies that there is a \mathcal{G}-measurable random variable P(A \mid \mathcal{G}) : \Omega \to \mathbb{R}, called the conditional probability, such that
: \int_G P(A \mid \mathcal{G})(\omega) \, dP(\omega) = P(A \cap G)
for every G \in \mathcal{G}, and such a random variable is uniquely defined up to sets of probability zero. A conditional probability is called regular if P(\cdot \mid \mathcal{G})(\omega) is a probability measure on (\Omega, \mathcal{F}) for all \omega \in \Omega a.e.
Special cases:
* For the trivial sigma algebra \mathcal{G} = \{\emptyset, \Omega\}, the conditional probability is the constant function P(A \mid \{\emptyset, \Omega\}) = P(A).
* If A \in \mathcal{G}, then P(A \mid \mathcal{G}) = \mathbf{1}_A, the indicator function (defined below).
Let X : \Omega \to E be an (E, \mathcal{E})-valued random variable. For each B \in \mathcal{E}, define
: \mu_{X \mid \mathcal{G}}(B \mid \mathcal{G}) = P(X^{-1}(B) \mid \mathcal{G}).
For any \omega \in \Omega, the function \mu_{X \mid \mathcal{G}}(\cdot \mid \mathcal{G})(\omega) : \mathcal{E} \to \mathbb{R} is called the conditional probability distribution of X given \mathcal{G}. If it is a probability measure on (E, \mathcal{E}), then it is called regular.
For a real-valued random variable (with respect to the Borel \sigma-field \mathcal{R}^1 on \mathbb{R}), every conditional probability distribution is regular.[ Billingsley (1995), p. 439] In this case, \operatorname{E}[X \mid \mathcal{G}] = \int_{-\infty}^{\infty} x \, \mu_{X \mid \mathcal{G}}(dx, \cdot) almost surely.
Relation to conditional expectation
For any event A \in \mathcal{F}, define the indicator function:
: \mathbf{1}_A(\omega) = \begin{cases} 1 & \text{if } \omega \in A, \\ 0 & \text{if } \omega \notin A, \end{cases}
which is a random variable. Note that the expectation of this random variable is equal to the probability of A itself:
: \operatorname{E}(\mathbf{1}_A) = P(A).
Given a \sigma-field \mathcal{G} \subseteq \mathcal{F}, the conditional probability P(A \mid \mathcal{G}) is a version of the conditional expectation of the indicator function for A:
: P(A \mid \mathcal{G}) = \operatorname{E}(\mathbf{1}_A \mid \mathcal{G})
An expectation of a random variable with respect to a regular conditional probability is equal to its conditional expectation.
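The identity E(1_A) = P(A) can be verified exactly on a finite sample space; a minimal sketch using the fair-die space (names are my own illustration):

```python
from fractions import Fraction

# Sample space for one roll of a fair die, each outcome with probability 1/6.
omega = range(1, 7)
prob = {w: Fraction(1, 6) for w in omega}

A = {2, 4, 6}  # the event "the roll is even"
indicator = {w: 1 if w in A else 0 for w in omega}

# The expectation of the indicator equals the probability of A:
e_indicator = sum(indicator[w] * prob[w] for w in omega)
p_a = sum(prob[w] for w in A)
assert e_indicator == p_a == Fraction(1, 2)
```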
See also
* Conditioning (probability)
* Conditional probability
* Regular conditional probability
* Bayes' theorem
References
Sources
* Billingsley, Patrick (1995). ''Probability and Measure'', 3rd ed. New York: John Wiley & Sons.