
Automation bias is the propensity for humans to favor suggestions from automated decision-making systems and to ignore contradictory information made without automation, even if it is correct. Automation bias stems from the social psychology literature, which found a bias in human–human interaction: people assign more positive evaluations to decisions made by humans than to a neutral object. The same positivity bias has been found in human–automation interaction, where automated decisions are rated more positively than neutral ones. This has become a growing problem for decision making as intensive care units, nuclear power plants, and aircraft cockpits have increasingly integrated computerized system monitors and decision aids, largely in order to factor out possible human error. Errors of automation bias tend to occur when decision-making depends on computers or other automated aids and the human is in an observatory role but still able to make decisions. Examples of automation bias range from urgent matters such as flying a plane on automatic pilot to mundane ones such as the use of spell-checking programs.


Disuse and misuse

An operator's trust in the system can also lead to different interactions with the system, including system use, misuse, disuse, and abuse. The tendency toward overreliance on automated aids is known as "automation misuse". Misuse of automation can be seen when a user fails to properly monitor an automated system, or when the automated system is used when it should not be. This is in contrast to disuse, where the user does not properly utilize the automation, either by turning it off or by ignoring it. Both misuse and disuse can be problematic, but automation bias is directly related to misuse of the automation, through either too much trust in the abilities of the system or defaulting to using heuristics. Misuse can lead to a lack of monitoring of the automated system or blind agreement with an automation suggestion, categorized by two types of errors: errors of omission and errors of commission, respectively. Automation use and disuse can also influence stages of information processing: information acquisition, information analysis, decision making and action selection, and action implementation.

For example, information acquisition, the first step in information processing, is the process by which a user registers input via the senses. An automated engine gauge might assist the user with information acquisition through simple interface features—such as highlighting changes in the engine's performance—thereby directing the user's selective attention. When faced with issues originating from an aircraft, pilots may tend to overtrust the aircraft's engine gauges, losing sight of other possible malfunctions not related to the engine. This attitude is a form of automation complacency and misuse. If, however, the pilot devotes time to interpreting the engine gauge and manipulating the aircraft accordingly, only to discover that the flight turbulence has not changed, the pilot may be inclined to ignore future error recommendations conveyed by the engine gauge—a form of automation complacency leading to disuse.
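
The attention-directing function of such a gauge can be sketched in a few lines of code. This is an invented, minimal illustration; the reading names, baseline values, and the 10% deviation threshold are assumptions made for the example, not taken from any real instrument:

```python
# Illustrative only: a minimal "engine gauge" that supports information
# acquisition by highlighting readings that deviate from a baseline.
# Parameter names, values, and the threshold are invented for this sketch.

def highlight_changes(readings, baseline, threshold=0.10):
    """Return the readings an automated gauge would highlight.

    Flags any reading deviating from its baseline by more than
    `threshold` (as a fraction of the baseline), directing the
    user's selective attention to that parameter.
    """
    flagged = []
    for name, value in readings.items():
        if abs(value - baseline[name]) > threshold * baseline[name]:
            flagged.append(name)
    return flagged

readings = {"rpm": 2450, "oil_temp": 118, "fuel_flow": 39.0}
baseline = {"rpm": 2400, "oil_temp": 95, "fuel_flow": 40.0}
print(highlight_changes(readings, baseline))
# -> ['oil_temp']
```

A real gauge would involve far more signal processing; the point is only that the automation pre-selects what the user attends to, which is exactly where overtrust can take hold.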


Errors of commission and omission

Automation bias can take the form of commission errors, which occur when users follow an automated directive without taking into account other sources of information. Conversely, omission errors occur when automated devices fail to detect or indicate problems and the user does not notice because they are not properly monitoring the system. Errors of omission have been shown to result from cognitive vigilance decrements, while errors of commission result from a combination of a failure to take information into account and an excessive trust in the reliability of automated aids. Errors of commission occur for three reasons: (1) overt redirection of attention away from the automated aid; (2) diminished attention to the aid; (3) active discounting of information that counters the aid's recommendations. Omission errors occur when the human decision-maker fails to notice an automation failure, either due to low vigilance or overtrust in the system. For example, a spell-checking program incorrectly marking a word as misspelled and suggesting an alternative would be an error of commission, and a spell-checking program failing to notice a misspelled word would be an error of omission. In these cases, automation bias could be observed in a user accepting the alternative word without consulting a dictionary, or in a user not noticing the misspelled word and assuming all the words are correct without reviewing them. Training focused on the reduction of automation bias and related problems has been shown to lower the rate of commission errors, but not of omission errors.
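
The spell-checker example can be made concrete with a toy simulation. In the sketch below (the dictionary, the aid's behavior, and its specific mistakes are all invented for illustration), a deliberately imperfect aid produces one false alarm and one miss, which a user who follows the aid blindly would turn into a commission error and an omission error, respectively:

```python
# Toy illustration of commission vs. omission errors with a spell-check aid.
# All words, the "aid", and its failure cases are invented for this sketch.

DICTIONARY = {"the", "quick", "brown", "fox", "jumps"}

def aid_flags(word):
    """A deliberately imperfect automated aid."""
    if word == "fox":          # false alarm: flags a correct word
        return True
    if word == "jmps":         # miss: fails to flag a misspelling
        return False
    return word not in DICTIONARY

def classify_aid_errors(words):
    """Label each aid failure as a commission or omission error.

    A biased user who follows the aid blindly makes a commission
    error on every false alarm (accepting a needless "fix") and an
    omission error on every miss (never reviewing the word).
    """
    errors = []
    for w in words:
        actually_wrong = w not in DICTIONARY
        flagged = aid_flags(w)
        if flagged and not actually_wrong:
            errors.append((w, "commission"))
        elif actually_wrong and not flagged:
            errors.append((w, "omission"))
    return errors

print(classify_aid_errors(["the", "quick", "fox", "jmps"]))
# -> [('fox', 'commission'), ('jmps', 'omission')]
```

Note that the error labels here describe what the biased user would do with each aid failure; a vigilant user who double-checks the aid against the dictionary makes neither error.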


Factors

The presence of automatic aids, as one source puts it, "diminishes the likelihood that decision makers will either make the cognitive effort to seek other diagnostic information or process all available information in cognitively complex ways." It also renders users more likely to conclude their assessment of a situation too hastily after being prompted by an automatic aid to take a specific course of action. According to one source, there are three main factors that lead to automation bias. First, the human tendency to choose the least cognitive approach to decision-making, which is called the cognitive miser hypothesis. Second, the tendency of humans to view automated aids as having an analytical ability superior to their own. Third, the tendency of humans to reduce their own effort when sharing tasks, either with another person or with an automated aid. Other factors leading to an over-reliance on automation, and thus to automation bias, include inexperience in a task (though inexperienced users tend to benefit most from automated decision support systems), lack of confidence in one's own abilities, a lack of readily available alternative information, and a desire to save time and effort on complex tasks or high workloads. It has been shown that people who have greater confidence in their own decision-making abilities tend to be less reliant on external automated support, while those with more trust in decision support systems (DSS) are more dependent upon it.


Screen design

One study, published in the ''Journal of the American Medical Informatics Association'', found that the position and prominence of advice on a screen can affect the likelihood of automation bias, with prominently displayed advice, correct or not, being more likely to be followed; another study, however, seemed to discount the importance of this factor. According to another study, a greater amount of on-screen detail can make users less "conservative" and thus increase the likelihood of automation bias. One study showed that making individuals accountable for their performance or the accuracy of their decisions reduced automation bias.


Availability

"The availability of automated decision aids," states one study by
Linda Skitka Linda J. Skitka is a professor of psychology at the University of Illinois at Chicago. Skitka's research bridges a number of areas of inquiry including social, political, and moral psychology. Publications She has authored or co-authored paper ...
, "can sometimes feed into the general human tendency to travel the road of least cognitive effort."


Awareness of process

One study also found that when users are made aware of the reasoning process employed by a decision support system, they are likely to adjust their reliance accordingly, thus reducing automation bias.


Team vs. individual

The performance of jobs by crews instead of individuals acting alone does not necessarily eliminate automation bias. One study has shown that when automated devices failed to detect system irregularities, teams were no more successful than solo performers at responding to those irregularities.


Training

Training that focuses on automation bias in aviation has succeeded in reducing omission errors by student pilots.


Automation failure and "learned carelessness"

It has been shown that automation failure is followed by a drop in operator trust, which in turn is succeeded by a slow recovery of trust. The decline in trust after an initial automation failure has been described as the first-failure effect. By the same token, if automated aids prove to be highly reliable over time, the result is likely to be a heightened level of automation bias. This is called "learned carelessness."


Provision of system confidence information

In cases where system confidence information is provided to users, that information itself can become a factor in automation bias.


External pressures

Studies have shown that the more external pressures are exerted on an individual's cognitive capacity, the more he or she may rely on external support.


Definitional problems

Although automation bias has been the subject of many studies, there continue to be complaints that it remains ill-defined and that reporting of incidents involving it is unsystematic. A review of various automation bias studies categorized the types of tasks in which automated aids were used as well as the functions the aids served. Tasks were categorized as monitoring tasks, diagnosis tasks, or treatment tasks. Types of automated assistance were listed as alerting automation, which tracks important changes and alerts the user; decision support automation, which may provide a diagnosis or recommendation; and implementation automation, in which the automated aid performs a specified task.


Automation-induced complacency

The concept of automation bias is viewed as overlapping with automation-induced complacency, also known more simply as automation complacency. Like automation bias, it is a consequence of the misuse of automation and involves problems of attention. While automation bias involves a tendency to trust decision-support systems, automation complacency involves insufficient attention to, and monitoring of, automation output, usually because that output is viewed as reliable. "Although the concepts of complacency and automation bias have been discussed separately as if they were independent," writes one expert, "they share several commonalities, suggesting they reflect different aspects of the same kind of automation misuse." It has been proposed, indeed, that the concepts of complacency and automation bias be combined into a single "integrative concept" because the two "might represent different manifestations of overlapping automation-induced phenomena" and because "automation-induced complacency and automation bias represent closely linked theoretical concepts that show considerable overlap with respect to the underlying processes."

Automation complacency has been defined as "poorer detection of system malfunctions under automation compared with under manual control." NASA's Aviation Safety Reporting System (ASRS) defines complacency as "self-satisfaction that may result in non-vigilance based on an unjustified assumption of satisfactory system state." Several studies have indicated that it occurs most often when operators are engaged in both manual and automated tasks at the same time. In turn, operators' perceptions of the automated system's reliability can influence the way in which they interact with the system. Endsley (2017) describes how high system reliability can lead users to disengage from monitoring systems, thereby increasing monitoring errors, decreasing situational awareness, and interfering with an operator's ability to re-assume control of the system in the event that performance limitations have been exceeded. This complacency can be sharply reduced when automation reliability varies over time instead of remaining constant, but it is not reduced by experience and practice. Both expert and inexpert participants can exhibit automation bias as well as automation complacency; neither problem can be easily overcome by training.

The term "automation complacency" was first used in connection with aviation accidents or incidents in which pilots, air-traffic controllers, or other workers failed to check systems sufficiently, assuming that everything was fine when, in reality, an accident was about to occur. Operator complacency, whether or not automation-related, has long been recognized as a leading factor in air accidents. Perceptions of reliability can thus give rise to a form of automation irony, in which more automation can decrease cognitive workload but increase the opportunity for monitoring errors, while less automation can increase workload but decrease the opportunity for monitoring errors.

Take, for example, a pilot flying through inclement weather, in which continuous thunder interferes with the pilot's ability to understand information transmitted by an air traffic controller (ATC). However much effort is devoted to understanding the ATC transmissions, the pilot's performance is limited by the quality of the information source, so the pilot has to rely on automated gauges in the cockpit to understand flight-path information. If the pilot perceives the automated gauges to be highly reliable, the amount of effort needed to monitor both ATC and the gauges may decrease; indeed, the pilot may ignore the gauges in order to devote mental resources to deciphering the ATC transmissions. In so doing, the pilot becomes a complacent monitor, running the risk of missing critical information conveyed by the automated gauges. If, however, the pilot perceives the automated gauges to be unreliable, the pilot will have to interpret information from ATC and the gauges simultaneously. This means the operator may expend unnecessary cognitive resources when the automation is in fact reliable, but it also increases the odds of identifying potential errors in the gauges should they occur.
To calibrate the pilot's perception of reliability, automation should be designed to maintain workload at appropriate levels while also ensuring that the operator remains engaged with monitoring tasks. The operator should be less likely to disengage from monitoring when the system's reliability can change than with a system of consistent reliability (Parasuraman, 1993). To some degree, user complacency offsets the benefits of automation, and when an automated system's reliability falls below a certain level, automation is no longer a net asset. One 2007 study suggested that this crossover occurs when reliability drops to approximately 70%. Other studies have found that automation with a reliability level below 70% can still be of use to persons with access to the raw information sources, which can be combined with the automation output to improve performance. Death by GPS, in which deaths are caused in part by following inaccurate GPS directions, is another example of automation complacency.


Sectors

Automation bias has been examined across many research fields. It can be a particularly major concern in aviation, medicine, process control, and military command-and-control operations.


Aviation

At first, discussion of automation bias focused largely on aviation. Automated aids have played an increasing role in cockpits, taking a growing role in the control of such flight tasks as determining the most fuel-efficient routes, navigating, and detecting and diagnosing system malfunctions. The use of these aids, however, can lead to less attentive and less vigilant information seeking and processing on the part of human beings. In some cases, humans may place more confidence in the misinformation provided by flight computers than in their own skills. An important factor in aviation-related automation bias is the degree to which pilots perceive themselves as responsible for the tasks being carried out by automated aids.

One study of pilots showed that the presence of a second crewmember in the cockpit did not affect automation bias. A 1994 study compared the impact of low and high levels of automation (LOA) on pilot performance and concluded that pilots working with a high LOA spent less time reflecting independently on flight decisions. In another study, all of the pilots given false automated alerts instructing them to shut off an engine did so, even though those same pilots insisted in an interview that they would not respond to such an alert by shutting down an engine but would instead reduce the power to idle. One 1998 study found that pilots with approximately 440 hours of flight experience detected more automation failures than did nonpilots, although both groups showed complacency effects. A 2001 study of pilots using a cockpit automation system, the engine-indicating and crew-alerting system (EICAS), showed evidence of complacency: the pilots detected fewer engine malfunctions when using the system than when performing the task manually. In a 2005 study, experienced air-traffic controllers used a high-fidelity simulation of an ATC (Free Flight) scenario that involved the detection of conflicts among "self-separating" aircraft. They had access to an automated device that identified potential conflicts several minutes ahead of time. When the device failed near the end of the simulation, considerably fewer controllers detected the conflict than when the situation was handled manually. Other studies have produced similar findings.

Two studies of automation bias in aviation discovered a higher rate of commission errors than omission errors, while another aviation study found 55% omission rates and 0% commission rates. Automation-related omission errors are especially common during the cruise phase. When a China Airlines flight lost power in one engine, the autopilot attempted to correct for the problem by lowering the left wing, an action that hid the problem from the crew. When the autopilot was disengaged, the airplane rolled to the right and descended steeply, causing extensive damage. The 1983 shooting down of a Korean Air Lines 747 over Soviet airspace occurred because the Korean crew "relied on automation that had been inappropriately set up, and they never checked their progress manually."


Health care

Clinical decision support systems (CDSS) are designed to aid clinical decision-making. They have the potential to effect a great improvement in this regard and to result in improved patient outcomes. Yet while CDSS, when used properly, bring about an overall improvement in performance, they also cause errors that may go unrecognized owing to automation bias. One danger is that the incorrect advice given by these systems may cause users to change a correct decision that they have made on their own. Given the highly serious nature of some of the potential consequences of automation bias in the health-care field, it is especially important to be aware of this problem when it occurs in clinical settings. Sometimes automation bias in clinical settings is a major problem that renders CDSS, on balance, counterproductive; sometimes it is a minor problem, with the benefits outweighing the damage done.

One study found more automation bias among older users, but it was noted that this could be a result not of age but of experience. Studies suggest, indeed, that familiarity with CDSS often leads to desensitization and habituation effects. Although automation bias occurs more often among persons who are inexperienced in a given task, inexperienced users exhibit the most performance improvement when they use CDSS. In one study, the use of CDSS improved clinicians' answers by 21 percentage points, from 29% to 50%, with 7% of correct non-CDSS answers being changed incorrectly. A 2005 study found that when primary-care physicians used electronic sources such as PubMed, Medline, and Google, there was a "small to medium" increase in correct answers, while in an equally small percentage of instances the physicians were misled by their use of those sources and changed correct answers to incorrect ones. Studies in 2004 and 2008 on the effect of automated aids on the diagnosis of breast cancer found clear evidence of automation bias involving omission errors: cancers diagnosed in 46% of cases without automated aids were discovered in only 21% of cases in which the automated aids failed to identify the cancer.


Military

Automation bias can be a crucial factor in the use of intelligent decision support systems for military command-and-control operations. One 2004 study found that automation bias effects have contributed to a number of fatal military decisions, including friendly-fire killings during the Iraq War. Researchers have sought to determine the proper LOA for decision support systems in this field.


Automotive

Automation complacency is also a challenge for automated driving systems in which the human only has to monitor the system or act as a fallback driver. This is discussed, for example, in the National Transportation Safety Board's report on the fatal collision between an Uber test vehicle and the pedestrian Elaine Herzberg.


Correction

Automation bias can be mitigated by redesigning automated systems to reduce display prominence, decrease information complexity, or couch assistance as supportive rather than directive information. Training users on an automated system that introduces deliberate errors reduces automation bias more effectively than merely telling them that errors can occur. Excessive checking and questioning of automated assistance can increase time pressure and task complexity, reducing the benefit of automation, so some automated decision support systems are designed to balance positive and negative effects rather than attempt to eliminate negative effects.


See also

* Algorithmic bias
* Automation
* List of cognitive biases
* Out-of-the-loop performance problem



External links


''175 Reasons Why You Don't Think Clearly''