In probability theory, inverse probability is an old term for the probability distribution of an unobserved variable.
Today, the problem of determining an unobserved variable (by whatever method) is called inferential statistics. The method of inverse probability (assigning a probability distribution to an unobserved variable) is called Bayesian probability, the distribution of data given the unobserved variable is the likelihood function (which does not by itself give a probability distribution for the parameter), and the distribution of an unobserved variable, given both data and a prior distribution, is the posterior distribution. The development of the field and terminology from "inverse probability" to "Bayesian probability" is described by .

The term "inverse probability" appears in an 1837 paper of
De Morgan, in reference to
Laplace's method of probability (developed in a 1774 paper, which independently discovered and popularized Bayesian methods, and a 1812 book), though the term "inverse probability" does not occur in these. Fisher uses the term in , referring to "the fundamental paradox of inverse probability" as the source of the confusion between statistical terms that refer to the true value to be estimated, with the actual value arrived at by the estimation method, which is subject to error. Later Jeffreys uses the term in his defense of the methods of Bayes and Laplace, in . The term "Bayesian", which displaced "inverse probability", was introduced by
Ronald Fisher
Sir Ronald Aylmer Fisher (17 February 1890 – 29 July 1962) was a British polymath who was active as a mathematician, statistician, biologist, geneticist, and academic. For his work in statistics, he has been described as "a genius who a ...
in 1950. Inverse probability, variously interpreted, was the dominant approach to statistics until the development of
frequentism in the early 20th century by
Ronald Fisher
Sir Ronald Aylmer Fisher (17 February 1890 – 29 July 1962) was a British polymath who was active as a mathematician, statistician, biologist, geneticist, and academic. For his work in statistics, he has been described as "a genius who a ...
,
Jerzy Neyman
Jerzy Spława-Neyman (April 16, 1894 – August 5, 1981; ) was a Polish mathematician and statistician who first introduced the modern concept of a confidence interval into statistical hypothesis testing and, with Egon Pearson, revised Ronald Fis ...
and
Egon Pearson
Egon Sharpe Pearson (11 August 1895 – 12 June 1980) was one of three children of Karl Pearson and Maria, née Sharpe, and, like his father, a British statistician.
Career
Pearson was educated at Winchester College and Trinity College ...
. Following the development of frequentism, the terms
frequentist and
Bayesian developed to contrast these approaches, and became common in the 1950s.
Details
In modern terms, given a probability distribution ''p''(''x'' | θ) for an observable quantity ''x'' conditional on an unobserved variable θ, the "inverse probability" is the posterior distribution ''p''(θ | ''x''), which depends both on the likelihood function (the inversion of the probability distribution) and a prior distribution. The distribution ''p''(''x'' | θ) itself is called the direct probability.
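The relationship just described is Bayes' theorem; written out with the same symbols, ''x'' for the observed quantity and θ for the unobserved variable, and assuming θ is continuous so the normalizing constant is an integral:
:<math>p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)} = \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta')\, p(\theta')\, \mathrm{d}\theta'},</math>
so the posterior (the "inverse probability") is proportional to the likelihood times the prior.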
The ''inverse probability problem'' (in the 18th and 19th centuries) was the problem of estimating a parameter from experimental data in the experimental sciences, especially astronomy and biology. A simple example would be the problem of estimating the position of a star in the sky (at a certain time on a certain date) for purposes of navigation. Given the data, one must estimate the true position (probably by averaging). This problem would now be considered one of inferential statistics.
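A minimal sketch of this estimation step, assuming the measured positions are modeled as the true position plus independent normal noise; the numbers, the prior, and the normal-normal model are illustrative assumptions, not taken from the source:
<syntaxhighlight lang="python">
import statistics

# Hypothetical repeated measurements of a star's declination (degrees);
# the values are illustrative only.
measurements = [41.32, 41.27, 41.35, 41.30, 41.29]

# Classical "direct" estimate: average the observations.
estimate = statistics.mean(measurements)

# Inverse-probability (Bayesian) estimate under an assumed normal prior on the
# true position and normal measurement noise: the posterior is also normal, and
# its mean is a precision-weighted combination of the prior mean and the sample mean.
prior_mean, prior_var = 41.0, 1.0      # assumed prior belief about the position
noise_var = 0.05 ** 2                  # assumed variance of a single measurement
n = len(measurements)

posterior_var = 1.0 / (1.0 / prior_var + n / noise_var)
posterior_mean = posterior_var * (prior_mean / prior_var + n * estimate / noise_var)

print(f"sample mean: {estimate:.4f}")
print(f"posterior mean: {posterior_mean:.4f}, posterior sd: {posterior_var ** 0.5:.4f}")
</syntaxhighlight>
With a very diffuse prior the posterior mean approaches the plain sample mean, which is why simple averaging was a reasonable practical answer to the historical problem.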
The terms "direct probability" and "inverse probability" were in use until the middle part of the 20th century, when the terms "
likelihood function
A likelihood function (often simply called the likelihood) measures how well a statistical model explains observed data by calculating the probability of seeing that data under different parameter values of the model. It is constructed from the ...
" and "posterior distribution" became prevalent.
See also
* Bayesian probability
* Bayes' theorem