Likelihoodist
Likelihoodist statistics or likelihoodism is an approach to statistics that exclusively or primarily uses the likelihood function. Likelihoodist statistics is a more minor school than the main approaches of Bayesian statistics and frequentist statistics, but it has some adherents and applications. The central idea of likelihoodism is the likelihood principle: data are interpreted as evidence, and the strength of the evidence is measured by the likelihood function. Beyond this, there are significant differences within likelihood approaches: "orthodox" likelihoodists consider data ''only'' as evidence and do not use it as the basis of statistical inference, while others make inferences based on likelihood but without using Bayesian inference or frequentist inference. Likelihoodism is thus criticized either for not providing a basis for belief or action (if it fails to make inferences) or for not satisfying the requirements of those other schools. The likelihood function is also used in Bayesian statistics and frequentist statistics, but the schools differ in how they use it. Some likelihoodists consider their use of likelihood an alternative to other approaches, while others consider it complementary and compatible with them; see .
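As a minimal illustration (not part of the original article), the idea that the likelihood function measures strength of evidence can be sketched in Python for a binomial model. The data and the two hypothesized success probabilities below are hypothetical:

```python
from math import comb

def binomial_likelihood(p, k, n):
    """Likelihood of success probability p given k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Hypothetical data: 7 heads in 10 coin flips.
k, n = 7, 10

# Two simple hypotheses: a fair coin (p = 0.5) vs. a biased coin (p = 0.7).
L_fair = binomial_likelihood(0.5, k, n)
L_biased = binomial_likelihood(0.7, k, n)

# On the law of likelihood, the data favor the hypothesis with the higher
# likelihood, and the likelihood ratio measures the strength of that evidence.
ratio = L_biased / L_fair  # ~2.28: modest evidence favoring p = 0.7
```

A Bayesian would combine this same ratio with prior probabilities, and a frequentist would embed it in a test with controlled error rates; the likelihoodist stops at the ratio itself as the summary of the evidence.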


Relation with other theories


Criticism


History

Likelihoodism as a distinct school dates to , which gives a systematic treatment of statistics based on likelihood. This built on significant earlier work; see for a contemporary review. While comparing ratios of probabilities dates to early statistics and probability, notably Bayesian inference as developed by Pierre-Simon Laplace from the late 1700s, likelihood as a distinct concept is due to Ronald Fisher in . Likelihood played an important role in Fisher's statistics, but he also developed and used many non-likelihood frequentist techniques. His late writings, notably , emphasize likelihood more strongly and can be considered a precursor to a systematic theory of likelihoodism.

The likelihood principle was proposed in 1962 by several authors, notably , , and , and was followed by the law of likelihood in ; these laid the foundation for likelihoodism. See for the early history. While Edwards's version of likelihoodism treated likelihood only as evidence, a position followed by , others proposed inference based only on likelihood, notably as extensions of maximum likelihood estimation. Notable among these is John Nelder, who declared in :

Textbooks that take a likelihoodist approach include , , , , and . A collection of relevant papers is given by .


See also

* Akaike information criterion
* Foundations of statistics
* Likelihood ratio test


References



Further reading



External links

* {{cite web |url=https://plato.stanford.edu/entries/logic-inductive/sup-likelihood.html |title=Likelihood Ratios, Likelihoodism, and the Law of Likelihood |work=Stanford Encyclopedia of Philosophy |access-date=2019-03-14}}