Human reliability analysis: a critique and review for managers

S. French, T. Bedford, S. J. T. Pollard, E. Soane

Research output: Contribution to journal › Article

56 Citations (Scopus)

Abstract

In running our increasingly complex business systems, formal risk analyses and risk management techniques are becoming more important to managers: all managers, not just those charged with risk management. It is also becoming apparent that human behaviour is often a root or significant contributing cause of system failure. This latter observation is not novel; for more than 30 years it has been recognised that the role of human operators in safety-critical systems is so important that they should be explicitly modelled as part of the risk assessment of plant operations. This has led to the development of a range of methods under the general heading of human reliability analysis (HRA) to account for the effects of human error in risk and reliability analysis. The modelling approaches used in HRA, however, tend to be focussed on easily describable, sequential, generally low-level tasks, which are not the main source of systemic errors. Moreover, they focus on errors rather than the effects of all forms of human behaviour. In this paper we review and discuss HRA methodologies, arguing that there is a need for considerable further research and development before they meet the needs of modern risk and reliability analyses and are able to provide managers with the guidance they need to manage complex systems safely. We provide some suggestions for how work in this area should develop. But above all we seek to make the management community fully aware of the assumptions implicit in human reliability analysis and its limitations.
Language: English
Pages: 753-763
Number of pages: 11
Journal: Safety Science
Volume: 49
Issue number: 6
DOI: 10.1016/j.ssci.2011.02.008
Publication status: Published - Jul 2011

Keywords

  • safety
  • risk
  • human reliability analysis (HRA)
  • automaticity
  • shared mental models
  • performance
  • high reliability organisations
  • complex
  • rationality
  • management of risk
  • feelings
  • cognition
  • cynefin model of decision contexts
  • organizational culture

Cite this

French, S.; Bedford, T.; Pollard, S. J. T.; Soane, E. Human reliability analysis: a critique and review for managers. Safety Science, 2011, Vol. 49, No. 6, pp. 753-763.
@article{da3e375f4c5e4243bb3dab9e47277f75,
title = "Human reliability analysis: a critique and review for managers",
keywords = "safety, risk, human reliability analysis (HRA), automaticity, shared mental models, performance, high reliability organisations, complex, rationality, management of risk, feelings, cognition, cynefin model of decision contexts, organizational culture",
author = "S. French and T. Bedford and Pollard, {S. J. T.} and E. Soane",
year = "2011",
month = jul,
doi = "10.1016/j.ssci.2011.02.008",
language = "English",
volume = "49",
pages = "753--763",
journal = "Safety Science",
issn = "0925-7535",
number = "6",
}

