Expectation propagation (EP) is a technique in Bayesian machine learning. EP finds approximations to a probability distribution. It uses an iterative approach that exploits the factorization structure of the target distribution. It differs from other Bayesian approximation approaches such as variational Bayesian methods.
More specifically, suppose we wish to approximate an intractable probability distribution p(x) with a tractable distribution q(x). Expectation propagation achieves this approximation by minimizing the Kullback–Leibler divergence KL(p||q). Variational Bayesian methods minimize KL(q||p) instead. If q(x) is a Gaussian N(x; μ, Σ), then KL(p||q) is minimized when the mean μ and covariance Σ of q(x) are equal to the mean and covariance of p(x), respectively; this is called moment matching.
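Moment matching can be illustrated with a simple numerical sketch (the target distribution and its parameters below are hypothetical, chosen only for illustration): the KL(p||q)-optimal Gaussian approximation to a Gaussian mixture is the Gaussian whose mean and variance equal those of the mixture.

```python
import numpy as np

# Hypothetical target: p(x) = 0.5 N(-2, 1) + 0.5 N(2, 1)
weights = np.array([0.5, 0.5])
means = np.array([-2.0, 2.0])
variances = np.array([1.0, 1.0])

# Exact moments of the mixture (law of total mean and variance)
mean_p = np.sum(weights * means)
var_p = np.sum(weights * (variances + means**2)) - mean_p**2

# Matching these moments gives the KL(p||q)-optimal Gaussian q
print(f"q = N({mean_p}, {var_p})")  # mean 0.0, variance 5.0
```

Note that the resulting Gaussian is broad enough to cover both modes: minimizing KL(p||q) forces q to place mass wherever p does, whereas the KL(q||p) objective of variational Bayes tends instead to lock onto a single mode.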
Applications
Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.
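The core computation behind such indicator-function messages can be sketched as follows (a minimal illustration, not the TrueSkill source; the function name and variable choices are assumptions): multiplying a Gaussian prior N(μ, σ²) by the factor 1[x > 0] yields a truncated Gaussian, whose first two moments are available in closed form and define the moment-matched Gaussian approximation.

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def match_indicator_moments(mu, sigma):
    """Mean and variance of N(mu, sigma^2) truncated to x > 0,
    i.e. the moments of p(x) ∝ N(x; mu, sigma^2) * 1[x > 0]."""
    z = mu / sigma
    lam = phi(z) / Phi(z)  # inverse Mills ratio
    mean = mu + sigma * lam
    var = sigma**2 * (1.0 - lam * (lam + z))
    return mean, var

# Example: a zero-mean, unit-variance prior truncated to x > 0
# gives the moments of a half-normal distribution.
m, v = match_indicator_moments(0.0, 1.0)
print(m, v)  # ≈ 0.7979 (= sqrt(2/pi)) and ≈ 0.3634 (= 1 - 2/pi)
```

The Gaussian N(m, v) then replaces the intractable truncated distribution in subsequent message-passing steps.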
External links
Minka's EP papers
Machine learning
Bayesian statistics
{{compsci-stub}}