Radical probabilism is a hypothesis in philosophy, in particular epistemology and probability theory, that holds that no facts are known for certain. That view has profound implications for statistical inference. The philosophy is particularly associated with Richard Jeffrey, who wittily characterised it with the ''dictum'' "It's probabilities all the way down."
Background
Bayes' theorem states a rule for updating a probability conditioned on other information. In 1967, Ian Hacking argued that, in a static form, Bayes' theorem only connects probabilities that are held simultaneously; it does not tell the learner how to update probabilities when new evidence becomes available over time, contrary to what contemporary Bayesians suggested.
According to Hacking, adopting Bayes' theorem is a temptation. Suppose that a learner forms probabilities ''P''<sub>old</sub>(''A'' & ''B'') = ''p'' and ''P''<sub>old</sub>(''B'') = ''q''. If the learner subsequently learns that ''B'' is true, nothing in the axioms of probability or the results derived therefrom tells him how to behave. He might be tempted to adopt Bayes' theorem by analogy and set his ''P''<sub>new</sub>(''A'') = ''P''<sub>old</sub>(''A'' | ''B'') = ''p''/''q''.
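The update step can be sketched numerically; the probability values below are illustrative, not from the source.

```python
# The learner holds simultaneous probabilities P_old(A & B) = p and
# P_old(B) = q, then learns that B is true (illustrative values).
p_old_A_and_B = 0.2   # p
p_old_B = 0.5         # q

# Bayes' rule of updating: P_new(A) = P_old(A | B) = p / q
p_new_A = p_old_A_and_B / p_old_B

print(p_new_A)  # 0.4
```

Nothing in the probability axioms alone forces this step; the division is the extra "dynamic" assumption that the dynamic Dutch book argument below is meant to justify.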
In fact, that step, Bayes' rule of updating, can be justified, as necessary and sufficient, through a ''dynamic'' Dutch book argument that is additional to the arguments used to justify the probability axioms. This argument was first put forward by David Lewis in the 1970s, though he never published it. The dynamic Dutch book argument for Bayesian updating has been criticised by Hacking, H. Kyburg, D. Christensen and P. Maher. It was defended by Brian Skyrms.
Certain and uncertain knowledge
Bayes' rule of updating works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain". There must, on Lewis' account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, if that, can ever be known for certain. Jeffrey famously rejected Lewis' ''dictum''. He later quipped, "It's probabilities all the way down," a reference to the "turtles all the way down" metaphor for the infinite regress problem. He called this position ''radical probabilism''.
Conditioning on an uncertainty – probability kinematics
When the new evidence is itself uncertain, Bayes' rule is not able to capture a mere subjective change in the probability of some critical fact. The new evidence may not have been anticipated or even be capable of being articulated after the event. It seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was.
: ''P''<sub>new</sub>(''A'') = ''P''<sub>old</sub>(''A'' | ''B'')''P''<sub>new</sub>(''B'') + ''P''<sub>old</sub>(''A'' | not-''B'')''P''<sub>new</sub>(not-''B'')
Adopting such a rule is sufficient to avoid a Dutch book but not necessary. Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
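Jeffrey conditioning can be sketched with illustrative numbers: the learner's confidence in ''B'' shifts without ''B'' ever becoming certain.

```python
# Probability kinematics (Jeffrey conditioning), with illustrative
# values: experience shifts P(B) itself rather than making B certain.
p_old_B = 0.5
p_old_A_given_B = 0.8
p_old_A_given_not_B = 0.3

# The subjective probability of B moves from 0.5 to 0.7.
p_new_B = 0.7

# P_new(A) = P_old(A|B) P_new(B) + P_old(A|not-B) P_new(not-B)
p_new_A = p_old_A_given_B * p_new_B + p_old_A_given_not_B * (1 - p_new_B)

print(p_new_A)  # 0.8*0.7 + 0.3*0.3, i.e. about 0.65
```

Note that when ''P''<sub>new</sub>(''B'') = 1 the second term vanishes and the rule reduces to ordinary Bayesian conditioning, so Jeffrey conditioning generalises rather than replaces Bayes' rule of updating.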
Alternatives to probability kinematics
Probability kinematics is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Skyrms' principle of reflection. It turns out that probability kinematics is a special case of maximum entropy inference. However, maximum entropy is not a generalisation of all such sufficient updating rules.
Selected bibliography
* Jeffrey, R (1990) ''The Logic of Decision''. 2nd ed. University of Chicago Press.
* — (1992) ''Probability and the Art of Judgment''. Cambridge University Press.
* — (2004) ''Subjective Probability: The Real Thing''. Cambridge University Press.
* Skyrms, B (2012) ''From Zeno to Arbitrage: Essays on Quantity, Coherence & Induction''. Oxford University Press. (Features most of the papers cited below.)
External links
Stanford Encyclopedia of Philosophy entry on Bayes' theorem