Imprecise probability generalizes probability theory to allow for partial probability specifications, and is applicable when information is scarce, vague, or conflicting, in which case a unique probability distribution may be hard to identify. The theory thereby aims to represent the available knowledge more accurately. Imprecision is useful for dealing with expert elicitation, because:

* People have a limited ability to determine their own subjective probabilities and might find that they can only provide an interval.
* As an interval is compatible with a range of opinions, the analysis ought to be more convincing to a range of different people.


Introduction

Uncertainty is traditionally modelled by a probability distribution, as developed by Kolmogorov, Laplace, de Finetti, Ramsey, Cox, Lindley, and many others. However, this has not been unanimously accepted by scientists, statisticians, and probabilists: it has been argued that some modification or broadening of probability theory is required, because one may not always be able to provide a probability for every event, particularly when only little information or data is available (an early example of such criticism is Boole's critique of Laplace's work), or when we wish to model probabilities that a group agrees with, rather than those of a single individual.

Perhaps the most common generalization is to replace a single probability specification with an interval specification. Lower and upper probabilities, denoted by \underline{P}(A) and \overline{P}(A), or more generally lower and upper expectations (previsions), aim to fill this gap. A lower probability function is superadditive but not necessarily additive, whereas an upper probability is subadditive. To get a general understanding of the theory, consider:

* the special case with \underline{P}(A)=\overline{P}(A) for all events A, which is equivalent to a precise probability, and
* \underline{P}(A)=0 and \overline{P}(A)=1 for all non-trivial events A, which represents no constraint at all on the specification of P(A).

We then have a flexible continuum of more or less precise models in between. Some approaches, summarized under the name ''nonadditive probabilities'', directly use one of these set functions, assuming the other one to be naturally defined such that \underline{P}(A^c)=1-\overline{P}(A), where A^c is the complement of A. Other related concepts take the corresponding intervals [\underline{P}(A), \overline{P}(A)] for all events A as the basic entity.


History

The idea to use imprecise probability has a long history. The first formal treatment dates back at least to the middle of the nineteenth century, by George Boole, who aimed to reconcile the theories of logic and probability. In the 1920s, in ''A Treatise on Probability'', Keynes formulated and applied an explicit interval estimate approach to probability. Work on imprecise probability models proceeded fitfully throughout the 20th century, with important contributions by Bernard Koopman, C.A.B. Smith, I.J. Good, Arthur Dempster, Glenn Shafer, Peter M. Williams, Henry Kyburg, Isaac Levi, and Teddy Seidenfeld. At the start of the 1990s, the field began to gather momentum with the publication of Peter Walley's book ''Statistical Reasoning with Imprecise Probabilities'', which is also where the term "imprecise probability" originates. The 1990s also saw important works by Kuznetsov and by Weichselberger, who both use the term ''interval probability''. Walley's theory extends the traditional subjective probability theory via buying and selling prices for gambles, whereas Weichselberger's approach generalizes Kolmogorov's axioms without imposing an interpretation.

Standard consistency conditions relate upper and lower probability assignments to non-empty closed convex sets of probability distributions. As a welcome by-product, the theory therefore also provides a formal framework for models used in robust statistics and non-parametric statistics. Also included are concepts based on Choquet integration, and so-called two-monotone and totally monotone capacities, which have become very popular in artificial intelligence under the name (Dempster–Shafer) belief functions. Moreover, there is a strong connection to Shafer and Vovk's notion of game-theoretic probability.
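As a small illustration of the belief-function view mentioned above (with an invented mass assignment, not an example from the literature), the following sketch computes Dempster–Shafer belief and plausibility of an event; they behave as lower and upper probabilities.

    # A minimal sketch: belief and plausibility from a Dempster–Shafer mass
    # function over subsets (focal elements) of the frame {a, b, c}.
    # The mass values below are invented for illustration.
    mass = {
        frozenset({"a"}): 0.3,
        frozenset({"a", "b"}): 0.4,
        frozenset({"a", "b", "c"}): 0.3,
    }

    def belief(event):
        """Bel(A): total mass of focal elements contained in A (a lower probability)."""
        event = frozenset(event)
        return sum(m for focal, m in mass.items() if focal <= event)

    def plausibility(event):
        """Pl(A): total mass of focal elements intersecting A (an upper probability)."""
        event = frozenset(event)
        return sum(m for focal, m in mass.items() if focal & event)

    print(belief({"a", "b"}), plausibility({"a", "b"}))  # 0.7 1.0
    # Conjugacy again: Bel(A) = 1 - Pl(A^c).
    assert abs(belief({"a", "b"}) - (1 - plausibility({"c"}))) < 1e-12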


Mathematical models

The term "imprecise probability" is somewhat misleading in that precision is often mistaken for accuracy, whereas an imprecise representation may be more accurate than a spuriously precise representation. In any case, the term appears to have become established in the 1990s, and covers a wide range of extensions of the theory of
probability Probability is the branch of mathematics concerning numerical descriptions of how likely an Event (probability theory), event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and ...
, including: * credal sets, or sets of probability distributions * previsions * Random set theory * Dempster–Shafer evidence theory * lower and upper probabilities, or interval probabilities * belief functions * possibility and necessity measures * lower and upper previsions * comparative probability orderings * partial preference orderings * sets of desirable gambles * p-boxes * robust Bayes methods
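As a sketch of just one of these models, the p-box, the following Python fragment (an illustration under assumed numbers, not a definitive implementation) bounds an unknown cumulative distribution function when only an interval for the mean of a normal model is known.

    # A minimal sketch: a p-box as lower/upper bounds on an unknown CDF.
    # Assumption (invented for illustration): X is normal with standard
    # deviation 1 and a mean known only to lie in [0, 1].
    from statistics import NormalDist

    lo_mean, hi_mean, sigma = 0.0, 1.0, 1.0

    def lower_cdf(x):
        """Lower bound on F(x): the CDF is smallest at the largest admissible mean."""
        return NormalDist(hi_mean, sigma).cdf(x)

    def upper_cdf(x):
        """Upper bound on F(x): the CDF is largest at the smallest admissible mean."""
        return NormalDist(lo_mean, sigma).cdf(x)

    # Interval probability of the event X <= 0.5 under this p-box.
    print(lower_cdf(0.5), upper_cdf(0.5))  # about 0.31 and 0.69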


Interpretation of imprecise probabilities

A unification of many of the above-mentioned imprecise probability theories was proposed by Walley, although this is in no way the first attempt to formalize imprecise probabilities. In terms of probability interpretations, Walley's formulation of imprecise probabilities is based on the subjective variant of the Bayesian interpretation of probability. Walley defines upper and lower probabilities as special cases of upper and lower previsions and the gambling framework advanced by Bruno de Finetti. In simple terms, a decision maker's lower prevision is the highest price at which the decision maker is sure he or she would buy a gamble, and the upper prevision is the lowest price at which the decision maker is sure he or she would buy the opposite of the gamble (which is equivalent to selling the original gamble). If the upper and lower previsions are equal, then they jointly represent the decision maker's fair price for the gamble, the price at which the decision maker is willing to take either side of the gamble. The existence of a fair price leads to precise probabilities.

The allowance for imprecision, or a gap between a decision maker's upper and lower previsions, is the primary difference between precise and imprecise probability theories. Such gaps arise naturally in betting markets that happen to be financially illiquid due to asymmetric information. Henry Kyburg repeatedly emphasizes this gap for his interval probabilities, though he and Isaac Levi also give other reasons for intervals, or sets of distributions, representing states of belief.
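Under the set-of-distributions (lower envelope) reading, lower and upper previsions can be illustrated as the minimum and maximum expected payoff of a gamble over a credal set. The sketch below uses the same invented credal set as the earlier example (redefined so the fragment is self-contained); it is an illustration, not Walley's construction itself.

    # A minimal sketch: lower/upper previsions of a gamble as the minimum and
    # maximum expectation over a finite credal set (invented numbers).
    credal_set = [
        {"a": 0.2, "b": 0.3, "c": 0.5},
        {"a": 0.4, "b": 0.3, "c": 0.3},
        {"a": 0.3, "b": 0.5, "c": 0.2},
    ]

    # A gamble: an uncertain payoff that depends on which outcome occurs.
    gamble = {"a": -1.0, "b": 2.0, "c": 5.0}

    def expectation(p, f):
        """Expected payoff of gamble f under distribution p."""
        return sum(p[x] * f[x] for x in f)

    # Lower prevision: supremum buying price = smallest expectation (1.7 here).
    lower_prevision = min(expectation(p, gamble) for p in credal_set)
    # Upper prevision: infimum selling price = largest expectation (2.9 here).
    upper_prevision = max(expectation(p, gamble) for p in credal_set)

    print(lower_prevision, upper_prevision)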


Issues with imprecise probabilities

One issue with imprecise probabilities is that there is often an independent degree of caution or boldness inherent in the use of one interval, rather than a wider or narrower one. This may be a degree of confidence, a degree of fuzzy membership, or a threshold of acceptance. This is not as much of a problem for intervals that are lower and upper bounds derived from a set of probability distributions, e.g., a set of priors followed by conditionalization on each member of the set. However, it can lead to the question of why some distributions are included in the set of priors and some are not.

Another issue is why one can be precise about two numbers, a lower bound and an upper bound, rather than about a single number, a point probability. This issue may be merely rhetorical, as the robustness of a model with intervals is inherently greater than that of a model with point-valued probabilities. It does, however, raise concerns about inappropriate claims of precision at the endpoints, as well as for point values.

A more practical issue is what kind of decision theory can make use of imprecise probabilities. For fuzzy measures, there is the work of Ronald R. Yager. For convex sets of distributions, Levi's works are instructive. Another approach asks whether the threshold controlling the boldness of the interval matters more to a decision than simply taking the average or using a Hurwicz decision rule. Other approaches appear in the literature.
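To illustrate the decision-theoretic point, the following sketch (with invented interval-valued expectations; the rules shown are standard textbook criteria, not ones prescribed here) compares Γ-maximin, which picks the act with the best lower expectation, against a Hurwicz-style rule that blends lower and upper expectations with an optimism parameter.

    # A minimal sketch: choosing among acts whose expected payoffs are only
    # known as intervals (lower, upper). Numbers are invented for illustration.
    acts = {
        "act1": (1.0, 4.0),
        "act2": (2.0, 2.5),
        "act3": (0.5, 5.0),
    }

    def gamma_maximin(acts):
        """Pick the act with the best worst-case (lower) expectation."""
        return max(acts, key=lambda a: acts[a][0])

    def hurwicz(acts, alpha):
        """Blend upper and lower expectations with optimism parameter alpha."""
        return max(acts, key=lambda a: alpha * acts[a][1] + (1 - alpha) * acts[a][0])

    print(gamma_maximin(acts))       # act2: the cautious choice
    print(hurwicz(acts, alpha=0.8))  # act3: the wide interval wins when optimistic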


See also

* Ambiguity aversion
* Robust decision making
* Imprecise Dirichlet process


External links


* The Society for Imprecise Probability: Theories and Applications
* The imprecise probability group at IDSIA (http://ipg.idsia.ch)
* Stanford Encyclopedia of Philosophy article on Imprecise Probabilities