Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in critical thinking skills.

Biased search for information, biased interpretation of this information, and biased memory recall have been invoked to explain four specific effects:
# ''attitude polarization'' (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
# ''belief perseverance'' (when beliefs persist after the evidence for them is shown to be false)
# the ''irrational primacy effect'' (a greater reliance on information encountered early in a series)
# ''illusory correlation'' (when people falsely perceive an association between two events or situations)

A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.

Flawed decisions due to confirmation bias have been found in a wide range of political, organizational, financial and scientific contexts. These biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. For example, confirmation bias produces systematic errors in scientific research based on inductive reasoning (the gradual accumulation of supportive evidence). Similarly, a police detective may identify a suspect early in an investigation, but then may seek only confirming rather than disconfirming evidence. A medical practitioner may prematurely focus on a particular disorder early in a diagnostic session, and then seek only confirming evidence. In social media, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing", which display to individuals only information they are likely to agree with, while excluding opposing views.


Definition and context

Confirmation bias, a phrase coined by English psychologist Peter Wason, is the tendency of people to favor information that confirms or strengthens their beliefs or values, a tendency that is difficult to dislodge once affirmed. Confirmation bias is an example of a cognitive bias.

Confirmation bias (or confirmatory bias) has also been termed "myside bias"; "congeniality bias" has also been used. Confirmation biases are effects in information processing. They differ from what is sometimes called the ''behavioral confirmation effect'', commonly known as ''self-fulfilling prophecy'', in which a person's expectations influence their own behavior, bringing about the expected result.

Some psychologists restrict the term "confirmation bias" to the selective collection of evidence that supports what one already believes while ignoring or rejecting evidence that supports a different conclusion. Others apply the term more broadly to the tendency to preserve one's existing beliefs when searching for evidence, interpreting it, or recalling it from memory. Confirmation bias results from automatic, unintentional strategies rather than deliberate deception; it cannot be avoided or eliminated, only managed by improving education and critical thinking skills. Confirmation bias is a broad construct with a number of possible explanations, namely: hypothesis-testing by falsification, hypothesis-testing by positive test strategy, and information-processing explanations.


Types of confirmation bias


Biased search for information

Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis. Rather than searching through all the relevant evidence, they phrase questions to receive an affirmative answer that supports their theory. They look for the consequences that they would expect if their hypothesis were true, rather than what would happen if it were false. For example, someone using yes/no questions to find a number they suspect to be the number 3 might ask, "Is it an odd number?" People prefer this type of question, called a "positive test", even when a negative test such as "Is it an even number?" would yield exactly the same information. However, this does not mean that people seek tests that guarantee a positive answer. In studies where subjects could select either such pseudo-tests or genuinely diagnostic ones, they favored the genuinely diagnostic.

The preference for positive tests in itself is not a bias, since positive tests can be highly informative. However, in combination with other effects, this strategy can confirm existing beliefs or assumptions independently of whether they are true. In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior. Thus any search for evidence in favor of a hypothesis is likely to succeed.

One illustration of this is the way the phrasing of a question can significantly change the answer. For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you ''un''happy with your social life?" Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case. Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take them away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes, and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.
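The claim that a positive test ("Is it an odd number?") and its negative counterpart ("Is it an even number?") yield exactly the same information can be illustrated with a short sketch. The setup below (candidate range, the suspected number 3) is a hypothetical illustration, not taken from any specific experiment: either question partitions the remaining candidates identically, so the answer to one eliminates exactly the same possibilities as the answer to the other.

```python
# Illustrative sketch (hypothetical setup): a positive test and the
# corresponding negative test carry identical information, because
# both questions induce the same partition of the candidate set.

candidates = set(range(1, 11))  # possible secret numbers, 1..10

def remaining(cands, predicate, answer):
    """Candidates still possible after hearing a yes/no answer."""
    return {n for n in cands if predicate(n) == answer}

def is_odd(n):
    return n % 2 == 1

def is_even(n):
    return n % 2 == 0

secret = 3  # the number the guesser suspects

# Positive test: "Is it odd?"  -> answer is yes for 3.
after_positive = remaining(candidates, is_odd, is_odd(secret))
# Negative test: "Is it even?" -> answer is no for 3.
after_negative = remaining(candidates, is_even, is_even(secret))

# Both questions eliminate exactly the same candidates.
assert after_positive == after_negative == {1, 3, 5, 7, 9}
```

The bias described in the text is therefore not about which answer the question receives, but about which question people habitually choose to ask.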
Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuinely diagnostic tests. In an initial experiment, participants rated another person on the introversion–extroversion personality dimension on the basis of an interview. They chose the interview questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as, "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion, such as, "What would you do to liven up a dull party?" These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them. A later version of the experiment gave the participants less presumptive questions to choose from, such as, "Do you shy away from social interactions?" Participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests. This pattern, of a main preference for diagnostic tests and a weaker preference for positive tests, has been replicated in other studies.

Personality traits influence and interact with biased search processes. Individuals vary in their ability to defend their attitudes from external attacks in relation to selective exposure. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs. An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs. People with high confidence levels more readily seek out information that contradicts their personal position in order to form an argument. This can take the form of ''oppositional news consumption'', where individuals seek opposing partisan news in order to counterargue. Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions. Heightened confidence levels decrease preference for information that supports individuals' personal beliefs.

Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer. Objects on the computer screen followed specific laws, which the participants had to figure out. Participants could "fire" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.


Biased interpretation of information

Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased. A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it. Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then, they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing. In fact, the studies were fictional. Half the participants were told that one kind of study supported the deterrent effect and the other undermined it, while for the other participants the conclusions were swapped.

The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways. Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented." The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.

Another study of biased interpretation occurred during the 2004 U.S. presidential election and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry, or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent. There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory. In this experiment, the participants made their judgments while in a magnetic resonance imaging (MRI) scanner that monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.

Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the SAT (a college admissions test used in the United States) to assess their intelligence levels. They then read information regarding safety concerns for vehicles, and the experimenters manipulated the national origin of the car. American participants gave their opinion on whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no". Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. There was no difference among intelligence levels in the rate at which participants would ban a car.

Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.


Biased memory recall of information

People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect is called "selective recall", "confirmatory memory", or "access-biased memory". Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match. Some alternative approaches say that surprising information stands out and so is memorable. Predictions from both these theories have been confirmed in different experimental contexts, with neither theory winning outright.

In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors. They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group was told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" group recalling more extroverted behavior.

A selective memory effect has also been shown in experiments that manipulate the desirability of personality types. In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.

Changes in emotional states can also influence memory recall. Participants rated how they felt when they had first learned that O. J. Simpson had been acquitted of murder charges. They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more participants' opinion of the verdict had changed, the less stable were their memories of their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, past appraisals closely resembled current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics. Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.

Myside bias has been shown to influence the accuracy of memory recall. In one experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses. Participants reported a higher experience of grief at six months than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief they recalled was highly correlated with their current level of grief. Individuals appear to use their current emotional states to analyze how they must have felt when experiencing past events. Emotional memories are reconstructed by current emotional states.

One study showed how selective memory can maintain belief in extrasensory perception (ESP). Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information, and some of them incorrectly remembered the results as supporting ESP.


Individual differences

Myside bias was once believed to be correlated with intelligence; however, studies have shown that myside bias is more influenced by the ability to think rationally than by level of intelligence. Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies have characterized myside bias as an absence of "active open-mindedness", meaning the active search for why an initial idea may be wrong. Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of one's own side in comparison to the opposite side.

Studies have found individual differences in myside bias. One study investigated individual differences that are acquired through learning in a cultural context and are mutable, and found important individual differences in argumentation. Studies have suggested that individual differences such as deductive reasoning ability, ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of reasoning and of generating arguments, counterarguments, and rebuttals.

A study by Christopher Wolfe and Anne Britt also investigated how participants' views of "what makes a good argument?" can be a source of myside bias that influences the way a person formulates their own arguments. The study investigated individual differences in argumentation schema and asked participants to write essays. The participants were randomly assigned to write essays either for or against their preferred side of an argument and were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., one that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument. Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. The data also revealed that personal belief is not itself a ''source'' of myside bias; however, participants who believe that a good argument is one based on facts are more likely to exhibit myside bias than other participants. This evidence is consistent with the claims proposed in Baron's article: that people's opinions about what makes good thinking can influence how arguments are generated.


Discovery


Informal observations

Before psychological research on confirmation bias, the phenomenon had been observed throughout history. Beginning with the Greek historian
Thucydides Thucydides (; grc, , }; BC) was an Athenian historian and general. His ''History of the Peloponnesian War'' recounts the fifth-century BC war between Sparta and Athens until the year 411 BC. Thucydides has been dubbed the father of "scientifi ...
(c. 460 BC – c. 395 BC), who wrote of misguided reason in ''
The Peloponnesian War The Peloponnesian War (431–404 BC) was an ancient Greek war fought between Athens and Sparta and their respective allies for the hegemony of the Greek world. The war remained undecided for a long time until the decisive intervention of th ...
''; "... for it is a habit of mankind to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy". Italian poet
Dante Alighieri Dante Alighieri (; – 14 September 1321), probably baptized Durante di Alighiero degli Alighieri and often referred to as Dante (, ), was an Italian poet, writer and philosopher. His ''Divine Comedy'', originally called (modern Italian: '' ...
(1265–1321) noted it in the ''
Divine Comedy The ''Divine Comedy'' ( it, Divina Commedia ) is an Italian narrative poem by Dante Alighieri, begun 1308 and completed in around 1321, shortly before the author's death. It is widely considered the pre-eminent work in Italian literature and ...
'', in which
St. Thomas Aquinas Thomas Aquinas, OP (; it, Tommaso d'Aquino, lit=Thomas of Aquino; 1225 – 7 March 1274) was an Italian Dominican friar and priest who was an influential philosopher, theologian and jurist in the tradition of scholasticism; he is known ...
cautions Dante upon meeting in Paradise, "opinion—hasty—often can incline to the wrong side, and then affection for one's own opinion binds, confines the mind".
Ibn Khaldun Ibn Khaldun (; ar, أبو زيد عبد الرحمن بن محمد بن خلدون الحضرمي, ; 27 May 1332 – 17 March 1406, 732-808 AH) was an Arab The Historical Muhammad', Irving M. Zeitlin, (Polity Press, 2007), p. 21; "It is, of ...
noticed the same effect in his ''
Muqaddimah The ''Muqaddimah'', also known as the ''Muqaddimah of Ibn Khaldun'' ( ar, مقدّمة ابن خلدون) or ''Ibn Khaldun's Prolegomena'' ( grc, Προλεγόμενα), is a book written by the Arab The Arabs (singular: Arab; singular ...
'': In the ''
Novum Organum The ''Novum Organum'', fully ''Novum Organum, sive Indicia Vera de Interpretatione Naturae'' ("New organon, or true directions concerning the interpretation of nature") or ''Instaurationis Magnae, Pars II'' ("Part II of The Great Instauration ...
'', English philosopher and scientist
Francis Bacon Francis Bacon, 1st Viscount St Alban (; 22 January 1561 – 9 April 1626), also known as Lord Verulam, was an English philosopher and statesman who served as Attorney General and Lord Chancellor of England. Bacon led the advancement of both ...
(1561–1626). noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like".Bacon, Francis (1620). ''Novum Organum''. reprinted in via . He wrote: In the second volume of his ''
The World as Will and Representation ''The World as Will and Representation'' (''WWR''; german: Die Welt als Wille und Vorstellung, ''WWV''), sometimes translated as ''The World as Will and Idea'', is the central work of the German philosopher Arthur Schopenhauer. The first edition ...
'' (1844), German philosopher
Arthur Schopenhauer
observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it." In his 1897 essay ''
What Is Art?
'', Russian novelist
Leo Tolstoy
wrote: (Tolstoy, Leo (1896). ''What Is Art?'', ch. 1, p. 143. Translated from Russian by Aylmer Maude, New York, 1904.)
In his 1894 essay ''
The Kingdom of God Is Within You
'', Tolstoy had earlier written: (Tolstoy, Leo (1894). ''The Kingdom of God Is Within You'', p. 49. Translated from Russian by Constance Garnett, New York, 1894.)


Hypothesis testing (falsification) explanation (Wason)

In Peter Wason's initial experiment published in 1960 (which does not mention the term "confirmation bias"), he repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule. The actual rule was simply "any ascending sequence", but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last". The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19). Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias". Wason also used confirmation bias to explain the results of his selection task experiment. Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.
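The logic of the 2,4,6 task can be sketched in a few lines of code (an illustration of the task's structure, not Wason's original procedure): every triple that confirms the narrow "+2" hypothesis also satisfies the broad hidden rule, so positive tests alone can never expose the error; only a hypothesis-violating triple is diagnostic.

```python
# Sketch of the 2-4-6 task's logic (illustrative, not Wason's procedure).

def true_rule(t):
    """The hidden rule: any strictly ascending triple."""
    return t[0] < t[1] < t[2]

def hypothesis(t):
    """A typical participant's guess: each number is two greater than its predecessor."""
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

# Positive tests: triples chosen because they fit the hypothesized rule.
positive_tests = [(2, 4, 6), (11, 13, 15), (100, 102, 104)]
for t in positive_tests:
    # Every positive test satisfies both rules, so the experimenter's
    # "yes" can never distinguish the narrow guess from the broad truth.
    assert hypothesis(t) and true_rule(t)

# A falsifying test: violates the hypothesis but still fits the hidden rule,
# which is exactly the evidence needed to reject the too-narrow guess.
falsifying = (11, 12, 19)
assert not hypothesis(falsifying) and true_rule(falsifying)
print("Positive tests cannot discriminate; the falsifying triple can.")
```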


Hypothesis testing (positive test strategy) explanation (Klayman and Ha)

Klayman and Ha's 1987 paper argues that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis. They called this the "positive test strategy". This strategy is an example of a
heuristic
: a reasoning shortcut that is imperfect but easy to compute. Klayman and Ha used
Bayesian probability
and
information theory
as their standard of hypothesis-testing, rather than the falsificationism used by Wason. According to these ideas, each answer to a question yields a different amount of information, which depends on the person's prior beliefs. Thus a scientific test of a hypothesis is one that is expected to produce the most information. Since the information content depends on initial probabilities, a positive test can either be highly informative or uninformative. Klayman and Ha argued that when people think about realistic problems, they are looking for a specific answer with a small initial probability. In this case, positive tests are usually more informative than negative tests. However, in Wason's rule discovery task the answer—three numbers in ascending order—is very broad, so positive tests are unlikely to yield informative answers. Klayman and Ha supported their analysis by citing an experiment that used the labels "DAX" and "MED" in place of "fits the rule" and "doesn't fit the rule". This avoided implying that the aim was to find a low-probability rule. Participants had much more success with this version of the experiment. In light of this and other critiques, the focus of research moved away from confirmation versus falsification of an hypothesis, to examining whether people test hypotheses in an informative way, or an uninformative but positive way. The search for "true" confirmation bias led psychologists to look at a wider range of effects in how people process information.


Information processing explanations

There are currently three main
information processing
explanations of confirmation bias, plus a recent addition.


Cognitive versus motivational

According to
Robert MacCoun
, most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms. Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called ''
heuristics
'', that they use. For example, people may judge the reliability of evidence by using the ''
availability heuristic
'', that is, how readily a particular idea comes to mind. It is also possible that people can only focus on one thought at a time, so find it difficult to test alternative hypotheses in parallel. Another heuristic is the positive test strategy identified by Klayman and Ha, in which people test a hypothesis by examining cases where they expect a property or event to occur. This heuristic avoids the difficult or impossible task of working out how diagnostic each possible question will be. However, it is not universally reliable, so people can overlook challenges to their existing beliefs. Motivational explanations involve an effect of
desire
on
belief
. It is known that people prefer positive thoughts over negative ones in a number of ways: this is called the "
Pollyanna principle
". Applied to
argument
s or sources of
evidence
, this could explain why desired conclusions are more likely to be believed true. According to experiments that manipulate the desirability of the conclusion, people demand a high standard of evidence for unpalatable ideas and a low standard for preferred ideas. In other words, they ask, "Can I believe this?" for some suggestions and, "Must I believe this?" for others. Although
consistency
is a desirable feature of attitudes, an excessive drive for consistency is another potential source of bias because it may prevent people from neutrally evaluating new, surprising information. Social psychologist
Ziva Kunda
combines the cognitive and motivational theories, arguing that motivation creates the bias, but cognitive factors determine the size of the effect.


Cost-benefit

Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors. Using ideas from
evolutionary psychology
, James Friedrich suggests that people do not primarily aim at
truth
in testing hypotheses, but try to avoid the most costly errors. For example, employers might ask one-sided questions in job interviews because they are focused on weeding out unsuitable candidates.
Yaacov Trope
and Akiva Liberman's refinement of this theory assumes that people compare the two different kinds of error: accepting a false hypothesis or rejecting a true hypothesis. For instance, someone who underestimates a friend's honesty might treat him or her suspiciously and so undermine the friendship. Overestimating the friend's honesty may also be costly, but less so. In this case, it would be rational to seek, evaluate or remember evidence of their honesty in a biased way. When someone gives an initial impression of being introverted or extroverted, questions that match that impression come across as more
empathic
. This suggests that when talking to someone who seems to be an introvert, it is a sign of better
social skills
to ask, "Do you feel awkward in social situations?" rather than, "Do you like noisy parties?" The connection between confirmation bias and social skills was corroborated by a study of how college students get to know other people. Highly
self-monitoring
students, who are more sensitive to their environment and to
social norms
, asked more matching questions when interviewing a high-status staff member than when getting to know fellow students.
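Trope and Liberman's comparison of error costs can be sketched with a toy expected-cost calculation (the probabilities and costs below are my own illustrative assumptions, not figures from their paper): when wrongly distrusting an honest friend costs far more than wrongly trusting a dishonest one, the "biased" policy of giving the friend the benefit of the doubt minimizes expected cost.

```python
# Toy expected-cost calculation for Trope and Liberman's asymmetric-error
# argument. All numbers are illustrative assumptions, not from the paper.

P_HONEST, P_DISHONEST = 0.9, 0.1  # prior beliefs about the friend
COST_WRONG_DISTRUST = 10          # rejecting a true hypothesis: friendship undermined
COST_WRONG_TRUST = 2              # accepting a false hypothesis: costly, but less so

def expected_cost(policy):
    """Expected cost of unconditionally trusting or distrusting the friend."""
    if policy == "trust":
        return P_DISHONEST * COST_WRONG_TRUST  # only hurt if the friend is dishonest
    return P_HONEST * COST_WRONG_DISTRUST      # only hurt if the friend is honest

print("trust:   ", expected_cost("trust"))     # 0.2
print("distrust:", expected_cost("distrust"))  # 9.0
# With these costs, seeking and weighting evidence of the friend's honesty
# (a confirmation-biased policy) is the rational choice.
```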


Exploratory versus confirmatory

Psychologists
Jennifer Lerner
and
Philip Tetlock
distinguish two different kinds of thinking process. '' Exploratory thought'' neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while ''confirmatory thought'' seeks to justify a specific point of view. Lerner and Tetlock say that when people expect to justify their position to others whose views they already know, they will tend to adopt a similar position to those people, and then use confirmatory thought to bolster their own credibility. However, if the external parties are overly aggressive or critical, people will disengage from thought altogether, and simply assert their personal opinions without justification. Lerner and Tetlock say that people only push themselves to think critically and logically when they know in advance they will need to explain themselves to others who are well-informed, genuinely interested in the truth, and whose views they do not already know. Because those conditions rarely exist, they argue, most people are using confirmatory thought most of the time.


Make-believe

Developmental psychologist Eve Whitmore has argued that the beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood." The friction that arises when an adolescent's developing critical thinking leads them to question ingrained beliefs can prompt the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.


Real-world effects


Social media

In
social media
, confirmation bias is amplified by the use of
filter bubble
s, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views. Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs. Others have further argued that the mixture of the two is degrading
democracy
—claiming that this "algorithmic editing" removes diverse viewpoints and information—and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions. The rise of social media has contributed greatly to the rapid spread of
fake news
, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited to explain why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of the facts at hand). In combating the spread of fake news, social media sites have considered turning toward "digital nudging", which currently takes two forms: nudging of information and nudging of presentation. Nudging of information entails a social media site attaching a disclaimer or label that questions or warns users about the validity of a source, while nudging of presentation entails exposing users to information they may not have sought out but that could introduce them to viewpoints countering their own confirmation biases.


Science and scientific research

A distinguishing feature of
scientific thinking
is the search for confirming or supportive evidence (
inductive reasoning
) as well as falsifying evidence (
deductive reasoning
). Inductive research in particular can have a serious problem with confirmation bias. Many times in the
history of science
, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data. Several studies have shown that scientists rate studies that report findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with their previous beliefs. However, assuming that the research question is relevant, the experimental design adequate and the data are clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions. In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims. Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence. The discipline of
parapsychology
is often cited as an example in the context of whether it is a
protoscience
or a pseudoscience. An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias. For example,
experimental design
of
randomized controlled trial
s (coupled with their
systematic review
) aims to minimize sources of bias. The social process of
peer review
aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases. (Bartlett, Steven James, "The psychology of abuse in publishing: Peer review and editorial bias," chap. 7, pp. 147–177, in Steven James Bartlett, ''Normality does not equal mental health: The need to look elsewhere for standards of good psychological health''. Santa Barbara, CA: Praeger, 2011.)
Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs. Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.


Finance

Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money. In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit. To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument". In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.


Medicine and health

Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about the disorder. The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause. In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies. Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients. Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine. If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of
alternative medicine
, whose proponents are swayed by positive
anecdotal evidence
but treat
scientific evidence
hyper-critically.
Cognitive therapy
was developed by
Aaron T. Beck
in the early 1960s and has become a popular approach. According to Beck, biased information processing is a factor in depression. His approach teaches people to treat evidence impartially, rather than selectively reinforcing negative outlooks.
Phobias
and
hypochondria
have also been shown to involve confirmation bias for threatening information.


Politics, law and policing

Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to. Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with
mock trial
s. Both
inquisitorial
and adversarial criminal justice systems are affected by confirmation bias. Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position. On the other hand, confirmation bias can result in people ignoring or misinterpreting the signs of an imminent or incipient conflict. For example, psychologists Stuart Sutherland and Thomas Kida have each argued that
U.S. Navy
Admiral
Husband E. Kimmel
showed confirmation bias when playing down the first signs of the Japanese
attack on Pearl Harbor
. A two-decade study of political pundits by
Philip E. Tetlock
found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories. In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.


Social psychology

Social psychologists have identified two tendencies in the way people seek or interpret information about themselves. ''
Self-verification
'' is the drive to reinforce the existing
self-image
and ''
self-enhancement
'' is the drive to seek positive feedback. Both are served by confirmation biases. In experiments where people are given feedback that conflicts with their self-image, they are less likely to attend to it or remember it than when given self-verifying feedback. They reduce the impact of such information by interpreting it as unreliable. Similar experiments have found a preference for positive feedback, and the people who give it, over negative feedback.


Mass delusions

Confirmation bias can play a key role in the propagation of
mass delusions.
Witch trials are frequently cited as an example. Another is the Seattle windshield pitting epidemic, in which windshields appeared to be damaged by an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that theirs too had been damaged, and thus confirmed belief in the supposed epidemic. In fact, the windshields had been damaged beforehand, but the damage went unnoticed until people inspected them as the delusion spread.


Paranormal beliefs

One factor in the appeal of alleged
psychic
readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives. By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client. Investigator
James Randi
compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits". As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological
pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids. There are many different length measurements that can be made of, for example, the
Great Pyramid of Giza
and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.
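The near-inevitability of such matches is easy to demonstrate numerically. Below is a minimal sketch with made-up lengths (not real pyramid measurements): given even a handful of numbers and the freedom to take ratios, some combination falls within 1% of a "meaningful" constant by chance alone.

```python
import itertools

# Hypothetical lengths (invented for illustration), in arbitrary units.
lengths = [230.3, 146.6, 186.4, 219.1, 115.2, 280.5, 89.0, 52.4]
targets = {"pi": 3.14159, "golden ratio": 1.61803, "e": 2.71828}

# Check every ordered pair of lengths against every "meaningful" constant.
hits = [(x, y, name)
        for x, y in itertools.permutations(lengths, 2)
        for name, t in targets.items()
        if abs(x / y - t) / t < 0.01]   # within 1% of the target

for x, y, name in hits:
    print(f"{x} / {y} = {x / y:.4f}  ~ {name}")
```

With 8 lengths there are already 56 ordered pairs and 3 targets, i.e. 168 chances for a coincidence; allowing sums, differences and products as well multiplies the opportunities further, which is why selective searching almost always "succeeds".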


Recruitment and selection

Unconscious cognitive bias (including confirmation bias) in job recruitment affects hiring decisions and can hinder efforts to build a diverse and inclusive workplace. A variety of unconscious biases affect recruitment decisions, but confirmation bias is one of the major ones, especially during the interview stage. The interviewer often selects the candidate who confirms their own beliefs, even when other candidates are equally or better qualified.


Associated effects and outcomes


Polarization of opinion

When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization". The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw—whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them. A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes. In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real. 
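For comparison with the participants' growing confidence, the rational response to the alternating sequence can be computed with Bayes' theorem. This is a minimal sketch (illustrative code, not the original study's materials), assuming the 60/40 basket compositions described above: the rational probability merely oscillates and never trends upward.

```python
def update(p_a, color):
    """One Bayesian update of P(basket A), where basket A is 60% black /
    40% red and basket B is 40% black / 60% red."""
    like_a = 0.6 if color == "black" else 0.4
    like_b = 0.4 if color == "black" else 0.6
    return like_a * p_a / (like_a * p_a + like_b * (1 - p_a))

p_a = 0.5                                # equal priors for the two baskets
for color in ["black", "red"] * 5:       # alternating draws
    p_a = update(p_a, color)
    print(f"after {color}: P(A) = {p_a:.2f}")   # oscillates 0.60, 0.50, ...
print(f"final: P(A) = {p_a:.2f}")        # 0.50 after equal numbers of each color
```

The contrast is the point: a rational observer's confidence returns to 50% after every black/red pair, whereas the participants who stated their judgments aloud became steadily more confident.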
Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, only happening in a small minority of cases, and it was prompted not only by considering mixed evidence, but by merely thinking about the topic. Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of
gun control
and affirmative action. They measured the attitudes of their participants towards these issues before and after reading arguments on each side of the debate. Two groups of participants showed attitude polarization: those with strong prior opinions and those who were politically knowledgeable. In part of this study, participants chose which information sources to read, from a list prepared by the experimenters. For example, they could read the
National Rifle Association's and the Brady Anti-Handgun Coalition's arguments on gun control. Even when instructed to be even-handed, participants were more likely to read arguments that supported their existing attitudes than arguments that did not. This biased search for information correlated well with the polarization effect. The ''backfire effect'' is the name for the finding that, when given evidence against their beliefs, people can reject the evidence and believe even more strongly. The phrase was coined by
Brendan Nyhan
and Jason Reifler in 2010. However, subsequent research has failed to replicate findings supporting the backfire effect. One study, conducted at Ohio State University and George Washington University, tested 10,100 participants on 52 issues expected to trigger a backfire effect. While the study found that individuals are reluctant to embrace facts that contradict their ideology, no cases of backfire were detected. The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence (compare the boomerang effect).


Persistence of discredited beliefs

Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted. This belief perseverance effect was first demonstrated experimentally by
Festinger, Riecken, and Schachter. These psychologists spent time with a cult whose members were convinced that the world would
end
on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named ''
When Prophecy Fails
''. The term ''belief perseverance'', however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their
attitude change
is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.
A common finding is that at least some of the initial belief remains even after a full debriefing. In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told. In another study, participants read
job performance
ratings of two firefighters, along with their responses to a
risk aversion
test. This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told they did less well than a risk-averse colleague. Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive. When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained. Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief. The continued influence effect is the tendency for misinformation to continue to influence memory and reasoning about an event, despite the misinformation having been retracted or corrected. This occurs even when the individual believes the correction.


Preference for early information

Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order. This ''irrational primacy effect'' is independent of the primacy effect in memory in which the earlier items in a series leave a stronger memory trace. Biased interpretation offers an explanation for this effect: seeing the initial evidence, people form a working hypothesis that affects how they interpret the rest of the information. One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them. In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other. The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty. Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide. After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.
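The statement that the sixty draws were rationally neutral can be verified directly: for independent draws, the likelihood depends only on how many chips of each color were seen, not on the order in which they appeared. Here is a brief sketch assuming, for illustration, 60/40 urn compositions (the original compositions are not specified above):

```python
from math import prod

def posterior_urn_a(draws):
    """P(urn A | draws) with equal priors; urn A is 60% black / 40% red,
    urn B is 40% black / 60% red."""
    like_a = prod(0.6 if c == "black" else 0.4 for c in draws)
    like_b = prod(0.4 if c == "black" else 0.6 for c in draws)
    return like_a / (like_a + like_b)

grouped = ["black"] * 30 + ["red"] * 30      # first thirty draws favor urn A
print(f"{posterior_urn_a(grouped):.4f}")       # 0.5000
print(f"{posterior_urn_a(grouped[::-1]):.4f}") # 0.5000: order is irrelevant
```

Because the likelihood is a product over independent draws, any permutation of the same sixty chips yields the same posterior; the participants' preference for the urn suggested by the first thirty draws is therefore a pure order effect.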


Illusory association between events

Illusory correlation is the tendency to see non-existent correlations in a set of data. This tendency was first demonstrated in a series of experiments in the late 1960s. In one experiment, participants read a set of psychiatric case studies, including responses to the
Rorschach inkblot test. The participants reported that the homosexual men in the set were more likely to report seeing buttocks, anuses or sexually ambiguous figures in the inkblots. In fact, the fictional case studies had been constructed so that the homosexual men were no more likely to report this imagery or, in one version of the experiment, were less likely to report it than heterosexual men. In a survey, a group of experienced psychoanalysts reported the same set of illusory associations with homosexuality. Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero. This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior. In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of ''positive-positive'' cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (no pain and/or good weather). This parallels the reliance on positive tests in hypothesis testing. It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.
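The arthritis example can be made concrete with a 2×2 contingency table. The following is a minimal sketch using invented counts (not the study's data): the salient pain-and-bad-weather cell is large, which makes the association feel real, yet the phi coefficient is zero because the pain rate is identical under both weather conditions.

```python
def phi(a, b, c, d):
    """Phi coefficient for a 2x2 contingency table:
       a = pain & bad weather       b = pain & good weather
       c = no pain & bad weather    d = no pain & good weather"""
    num = a * d - b * c
    den = ((a + b) * (c + d) * (a + c) * (b + d)) ** 0.5
    return num / den

# 30 memorable pain-plus-bad-weather days look like a pattern, but pain
# occurs on 75% of days regardless of the weather (30/40 vs 60/80).
print(phi(a=30, b=60, c=10, d=20))  # 0.0: no real association
```

Attending only to cell ''a'' (the positive-positive cases) is exactly the bias described above; the correlation is only visible, or invisible, once all four cells are counted.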


See also

* Apophenia
* Cherry picking
* Circular source
* Compartmentalization (psychology)
* Cognitive bias mitigation
* Cognitive inertia
* Cognitive miser
* Denial
* Denialism
* Echo chamber (media)
* Fallacy
* False consensus effect
* Groupthink
* Hostile media effect
* Idée fixe (psychology)
* Illusory truth effect
* Inoculation theory
* List of cognitive biases
* Observer-expectancy effect
* Post truth
* Selective perception
* Semmelweis reflex
* Woozle effect




External links


Skeptic's Dictionary: confirmation bias
– Robert T. Carroll

– class handout and instructor's notes by K.H. Grobman
Confirmation bias at You Are Not So Smart

Confirmation bias learning object
– interactive number triples exercise by Rod McFarland for Simon Fraser University

– Keith Rollag, Babson College