Human Reliability Analysis

Human reliability (also known as human performance or HU) is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors such as age, state of mind, physical health, attitude, emotions, propensity for certain common mistakes, errors and cognitive biases. Human reliability is very important due to the contributions of humans to the resilience of systems and to the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of the large socio-technical systems that are common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.


Common Traps of Human Nature

People tend to overestimate their ability to maintain control when they are doing work. The common characteristics of human nature addressed below are especially accentuated when work is performed in a complex work environment.

* Stress – Stress can accumulate and overpower a person, thus becoming detrimental to performance.
* Avoidance of mental strain – Humans are reluctant to engage in lengthy concentrated thinking, as it requires high levels of attention for extended periods. The mental biases, or shortcuts, often used to reduce mental effort and expedite decision-making include:
** Assumptions – A condition taken for granted or accepted as true without verification of the facts.
** Habit – An unconscious pattern of behavior acquired through frequent repetition.
** Confirmation bias – The reluctance to abandon a current solution in the face of disconfirming information.
** Similarity bias – The tendency to recall solutions from situations that appear similar.
** Frequency bias – A gamble that a frequently used solution will work.
** Availability bias – The tendency to settle on solutions or courses of action that readily come to mind.
* Limited working memory – The mind's short-term memory is the "workbench" for problem solving and decision-making.
* Limited attention resources – The limited ability to concentrate on two or more activities challenges the ability to process the information needed to solve problems.
* Mind-set – Because human beings are primarily goal-oriented by nature, people tend to focus more on what they want to accomplish (a goal) and less on what needs to be avoided. As such, people tend to "see" only what the mind expects, or wants, to see.
* Difficulty seeing one's own errors – Individuals, especially when working alone, are particularly susceptible to missing their own errors.
* Limited perspective – Humans cannot see all there is to see. The inability of the human mind to perceive all facts pertinent to a decision challenges problem-solving.
* Susceptibility to emotional/social factors – Anger and embarrassment adversely influence team and individual performance.
* Fatigue – People get tired. Physical, emotional, and mental fatigue can lead to error and poor judgment.
* Presenteeism – Some employees feel compelled to be present in the workplace despite a diminished capacity to perform their jobs due to illness or injury.


Analysis techniques

A variety of methods exist for human reliability analysis (HRA). Two general classes of methods are those based on probabilistic risk assessment (PRA) and those based on a cognitive theory of control.


PRA-based techniques

One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a power plant, so can a human operator commit errors. In both cases, an analysis (functional decomposition for equipment and task analysis for humans) articulates a level of detail for which failure or error probabilities can be assigned. This basic idea underlies the Technique for Human Error Rate Prediction (THERP), which is intended to generate human error probabilities for incorporation into a PRA. The Accident Sequence Evaluation Program (ASEP) human reliability procedure is a simplified form of THERP; an associated computational tool is the Simplified Human Error Analysis Code (SHEAN). More recently, the US Nuclear Regulatory Commission has published the Standardized Plant Analysis Risk – Human Reliability Analysis (SPAR-H) method to take account of the potential for human error.
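As a rough illustration of how such a method quantifies error, SPAR-H scales a nominal human error probability (NHEP) by multipliers for performance shaping factors (PSFs), with an adjustment that keeps the result a valid probability. The sketch below shows that calculation only; the example multipliers and factors are assumptions for demonstration, not values taken from the method's tables.

```python
# Minimal sketch of a SPAR-H-style quantification step (values illustrative).
# A nominal human error probability (NHEP) is scaled by performance shaping
# factor (PSF) multipliers; the adjustment keeps the result below 1.0.

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}  # SPAR-H nominal values

def adjusted_hep(task_type: str, psf_multipliers: list[float]) -> float:
    """Return an adjusted human error probability for a single task."""
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    for multiplier in psf_multipliers:
        composite *= multiplier
    # SPAR-H adjustment factor: without it, a large composite multiplier
    # (several degraded PSFs) could push the product past 1.0.
    return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)

# Hypothetical example: an action task performed under high stress (x2)
# and poor ergonomics (x10), all other shaping factors nominal (x1).
print(adjusted_hep("action", [2.0, 10.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]))
# ~0.0196
```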


Cognitive control based techniques

Erik Hollnagel has developed this line of thought in his work on the Contextual Control Model (COCOM) and the Cognitive Reliability and Error Analysis Method (CREAM). COCOM models human performance as a set of four control modes: strategic (based on long-term planning), tactical (based on procedures), opportunistic (based on present context), and scrambled (random). It also proposes a model of how transitions between these control modes occur, based on a number of factors, including the human operator's estimate of the outcome of the action (success or failure), the time remaining to accomplish the action (adequate or inadequate), and the number of simultaneous goals of the human operator at that time. CREAM is a human reliability analysis method that is based on COCOM.
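The transition model lends itself to a simple state-machine sketch. The following toy illustration is keyed to the three factors named above; the four mode names come from COCOM, but the thresholds and rules are invented for demonstration and are not Hollnagel's.

```python
# Toy state-model sketch of COCOM control-mode transitions. The four mode
# names come from the model; the transition rules below are invented,
# simplified illustrations keyed to the three factors named in the text.

from enum import Enum

class ControlMode(Enum):
    STRATEGIC = "strategic"          # long-term planning
    TACTICAL = "tactical"            # procedure-following
    OPPORTUNISTIC = "opportunistic"  # driven by the present context
    SCRAMBLED = "scrambled"          # effectively random action

def next_mode(outcome_is_success: bool, time_adequate: bool,
              simultaneous_goals: int) -> ControlMode:
    """Choose a control mode from the operator's situation (illustrative)."""
    if not time_adequate and not outcome_is_success:
        return ControlMode.SCRAMBLED      # no time and things are failing
    if not time_adequate:
        return ControlMode.OPPORTUNISTIC  # act on whatever context affords
    if simultaneous_goals > 2:
        return ControlMode.TACTICAL       # fall back on procedures
    return ControlMode.STRATEGIC          # time and success permit planning

# Example: an operator short of time whose last action failed.
print(next_mode(outcome_is_success=False, time_adequate=False,
                simultaneous_goals=3))   # ControlMode.SCRAMBLED
```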


Related techniques

Related techniques in safety engineering and reliability engineering include failure mode and effects analysis, HAZOP, fault tree analysis, and SAPHIRE (Systems Analysis Programs for Hands-on Integrated Reliability Evaluations).


Human Factors Analysis and Classification System (HFACS)

The Human Factors Analysis and Classification System (HFACS) was developed initially as a framework to understand the role of "human error" in aviation accidents (Wiegmann and Shappell, 2003). It is based on James Reason's Swiss cheese model of human error in complex systems. HFACS distinguishes between the "active failures" of unsafe acts and the "latent failures" of preconditions for unsafe acts, unsafe supervision, and organizational influences. These categories were developed empirically on the basis of many aviation accident reports.

"Unsafe acts" are performed by the human operator "on the front line" (e.g., the pilot, the air traffic controller, the driver). Unsafe acts can be either errors (in perception, decision making or skill-based performance) or violations (routine or exceptional). The errors here are similar to those discussed above. Violations are the deliberate disregard for rules and procedures. As the name implies, routine violations occur habitually and are usually tolerated by the organization or authority; exceptional violations are unusual and often extreme. For example, driving 60 mph in a 55-mph speed limit zone is a routine violation, while driving 130 mph in the same zone is exceptional.

There are two types of preconditions for unsafe acts: those that relate to the human operator's internal state and those that relate to the human operator's practices or ways of working. Adverse internal states include those related to physiology (e.g., illness) and mental state (e.g., mental fatigue, distraction). A third aspect of internal state is really a mismatch between the operator's ability and the task demands; for example, the operator may be unable to make visual judgments or react quickly enough to support the task at hand. Poor operator practices are another type of precondition for unsafe acts. These include poor crew resource management (issues such as leadership and communication) and poor personal readiness practices (e.g., violating crew rest requirements in aviation).

Four types of unsafe supervision are: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations.

Organizational influences include those related to resources management (e.g., inadequate human or financial resources), organizational climate (structures, policies, and culture), and organizational processes (such as procedures, schedules, and oversight).
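For illustration, the four-level taxonomy described above can be encoded as a small data structure for tagging findings from accident reports. The level and category names follow the text; the encoding itself is just one possible sketch, not an official representation of HFACS.

```python
# Illustrative encoding of the HFACS taxonomy described above. The level
# and category names follow the text; the structure is just one possible
# way to represent the framework when tagging accident-report findings.

HFACS = {
    "unsafe acts": {
        "errors": ["perception", "decision making",
                   "skill-based performance"],
        "violations": ["routine", "exceptional"],
    },
    "preconditions for unsafe acts": {
        "internal states": ["adverse physiological state",
                            "adverse mental state",
                            "ability/task-demand mismatch"],
        "operator practices": ["poor crew resource management",
                               "poor personal readiness"],
    },
    "unsafe supervision": {
        "types": ["inadequate supervision",
                  "planned inappropriate operations",
                  "failure to correct a known problem",
                  "supervisory violations"],
    },
    "organizational influences": {
        "types": ["resources management", "organizational climate",
                  "organizational processes"],
    },
}

# Example: tag one finding from a report and check it against the taxonomy.
level, category, subtype = "unsafe acts", "violations", "routine"
assert subtype in HFACS[level][category]
```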


See also

* ATHEANA (A Technique for Human Event Analysis)
* TESEO (Tecnica Empirica Stima Errori Operatori)


References

* Federal Aviation Administration. (2009). Electronic code of regulations. Retrieved September 25, 2009, from https://web.archive.org/web/20120206214308/http://www.airweb.faa.gov/Regulatory_and_Guidance_library/rgMakeModel.nsf/0/5a9adccea6c0c4e286256d3900494a77/$FILE/H3WE.pdf
* Wiegmann, D. A., & Shappell, S. A. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Aldershot, UK: Ashgate.


Further reading

* CCPS, Guidelines for Preventing Human Error. This book explains qualitative and quantitative methodologies for predicting human error: a qualitative methodology called SPEAR (Systems for Predicting Human Error and Recovery) and quantitative methodologies including THERP, among others.


External links


Standards and guidance documents


* IEEE Standard 1082 (1997): IEEE Guide for Incorporating Human Action Reliability Analysis for Nuclear Power Generating Stations
* DOE Standard DOE-HDBK-1028-2009: Human Performance Improvement Handbook


Tools


* EPRI HRA Calculator
* RiskSpectrum HRA software
* Simplified Human Error Analysis Code


Research labs


* Erik Hollnagel at the Crisis and Risk Research Centre at MINES ParisTech
* US Sandia National Laboratories
* Center for Human Reliability Studies at the US Oak Ridge National Laboratory
* Flight Cognition Laboratory at NASA Ames Research Center
* David Woods at the Cognitive Systems Engineering Laboratory at The Ohio State University
* Sidney Dekker's Leonardo da Vinci Laboratory for Complexity and Systems Thinking, Lund University, Sweden


Media coverage


* "How to Avoid Human Error in IT"
* "Human Reliability. We break down just like machines", Industrial Engineer, November 2004, 36(11): 66


Networking


* High Reliability Management group at LinkedIn.com