Author Bio

Dominic Cooper PhD is an independent researcher who has authored many books, articles and scientific research papers on safety culture, behavioural-safety and leadership.
January 12, 2023


Dominic Cooper: To err is human, or is it?


Dominic Cooper asks whether the errors examined in incident investigations are caused solely by the system or by the person.

Hands up if you did not follow a rule, forgot something, made a mistake or took a short-cut in the past 24 hours. Did you stop to consider why you had done so? If you did, what was your conclusion? You [a] did not have enough information to do a task properly (failure in planning)? [b] just forgot or became distracted (failure in execution)? [c] tried to save time (made a behavioural choice)? Each explanation points to a category of human error.

Given that most behaviours are goal-directed (e.g. make a cup of coffee, travel from A to B, fix a machine), human error is defined as “the failure of a planned action to achieve its goal”. To be termed human error, there must have been a plan, intention or goal, spawning a sequence of actions or behaviours that somehow failed to achieve the original objective. Some believe errors are the result of a person’s failure to do things correctly, while others state that error is always caused by other external factors, not the person.

Eliminate

Currently, it seems, many OSH practitioners want to eliminate accountability and culpability from incident investigations. They assert that all errors and behaviours are entirely system-induced, i.e. our work or task systems always ‘compel’ us to do what we do: a person can never be at fault. This is largely due to the influence of Jens Rasmussen, who in the early 1980s stated that human error in the workplace should be considered “as man-task mismatches or misfits”. For example, a row of buttons on a control panel that all look alike but each have a different function can create confusion during a time of stress, resulting in an incorrect button press with significant adverse impact. Rasmussen, asking why machine interfaces or systems were designed or configured in such ways, asserted that people were an integral system component. He postulated that all human errors are caused by external forces that reside in, or impact, the system. These forces included legislation, regulation, the company, its management, staff, the type of work and task design. Thus he, and others in this cognitive engineering tradition, such as Erik Hollnagel, concluded that errors are consequences rather than causes; therefore, people should not be blamed or held to account, as errors are forced on them entirely by some aspect of the system. Ultimately this led to the socio-technical model of incident analysis, whereby causation is determined by looking outwards from the person to the system(s), the environment and beyond. Typically, this would mean looking at [1] the [system] boundaries of what would be considered an acceptable state of affairs within a [task] system; [2] the individual’s resource profile (competence, skills, expertise); and [3] the available means of work (resources).

Pathogen Model

James Reason, also exploring human error at the same time, released his famous Pathogen Model of incident causation in 1990 (aka Swiss Cheese), which looked inwards to events from the outside (i.e. the opposite of the cognitive engineering approach). Reason agreed with Rasmussen that there were layers of system influence on errors and behaviour, but reduced these to only those within the company, while accepting that external influences (e.g. regulators) can impact internal decision-making. The key difference was that Reason highlighted how faults in each system layer interacted with the next to produce a holistic view of an incident’s trajectory. John Wreathall, from the US nuclear industry, added to Reason’s model by specifying the sources of system faults at each layer as [1] senior management; [2] line management; [3] operations support functions (HR, OSH, Engineering, etc.); [4] people’s on-the-job behaviours; and [5] defences (lack of, or ineffective, barriers). Reason asserted that unintentional knowledge- and rule-based mistakes (Failures in Planning) created latent system faults that lay dormant in the organisation until triggered by active failures such as unintentional slips and lapses (Failures in Execution) or intentional behavioural choices (e.g. [1] taking short-cuts; [2] ad-hoc optimisation of the way something is done; [3] overcoming organisational problems, such as the lack of the right equipment in the right place at the right time). Importantly, Reason argued that the person was separate from, or outside of, the system, and that errors were also causes of incidents. He promoted the creation of a Just Culture (procedural justice) to overcome the human tendency to automatically blame people for an error: accountability and culpability were only apportioned after a series of questions had entirely discounted system factors as the cause.
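
To make the layered idea concrete, here is a minimal illustrative sketch in Python (an assumption about how one might model the concept, not code taken from Reason or Wreathall): each of Wreathall’s five layers holds zero or more faults, and an incident trajectory only opens when every layer has an unmitigated gap at the same time.

from dataclasses import dataclass

@dataclass
class Layer:
    """One defensive layer in the Pathogen ('Swiss Cheese') picture."""
    name: str
    faults: list  # latent faults or active failures currently present

    def has_gap(self) -> bool:
        # A layer lets the hazard through when it holds an unmitigated fault.
        return bool(self.faults)

# Layer names follow Wreathall's five sources of system faults; the example
# faults are invented purely for illustration.
layers = [
    Layer("Senior management", ["production prioritised over maintenance"]),
    Layer("Line management", ["permit-to-work not enforced"]),
    Layer("Operations support (HR, OSH, Engineering)", ["risk assessment out of date"]),
    Layer("On-the-job behaviour", ["short-cut taken to save time"]),
    Layer("Defences", ["interlock bypassed"]),
]

# The 'holes line up' only when every layer has a gap at the same time.
incident_possible = all(layer.has_gap() for layer in layers)
print("Incident trajectory open:", incident_possible)

A real investigation is, of course, not a boolean check; the sketch only illustrates the interaction between layers that Reason describes.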

Integrated Rasmussen / Reason categories of Human Error

The figure opposite shows that both views of human error are compatible: Rasmussen’s boundaries show that error arises from interactions between the status of a system, the resources available to it, and the competence or expertise of those operating within it. Reason shows that the sources of human error are largely unintentional (mistakes, slips and lapses), but recognises that people’s behavioural choices also play a part. Thus, errors can be both consequences and causes! Which is pertinent depends on the circumstances specific to an incident. Optimising a work or task system in conjunction with the workforce is an obvious way of eliminating the human error traps (precursors) that negatively influence people’s behaviour.
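
As a rough, hypothetical illustration of what logging and tracking such error traps might look like in practice (the trap descriptions, fields and workflow are assumptions, not a published method), a crew could keep a simple register and work it down with management:

from dataclasses import dataclass

@dataclass
class ErrorTrap:
    """A condition or situation likely to provoke error (a precursor)."""
    description: str   # e.g. look-alike buttons with different functions
    location: str      # where in the work or task system it was found
    fixed: bool = False

traps = [
    ErrorTrap("look-alike buttons with different functions", "control room"),
    ErrorTrap("right tool rarely available at the workface", "maintenance"),
    ErrorTrap("conflicting instructions in two procedures", "permit office"),
]

open_traps = [t for t in traps if not t.fixed]
print(f"{len(open_traps)} error trap(s) still present in the task system")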

Beyond human

The melding of the two approaches shows the importance of incident investigations recognising that human error is often (but not always) the result of interactions between cognitive, system, and situational factors: in other words, both human and organisational factors. As such, it is quite wrong to automatically assert either the system(s) is the sole cause of a human error or the person is solely to blame.

Moreover, it is equally wrong for incident investigations to ascribe human error as an incident cause without stating what the human error was (an unintentional mistake, slip or lapse, or an intentional behavioural choice) and which system or situational factor(s) influenced it. In this way, the human error construct becomes much more useful in creating or maintaining a just culture, while promoting organisational learning to prevent repeat occurrences.
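
As a purely illustrative sketch of this point (the function and field names are assumptions, not an established investigation tool), an investigation record could simply refuse a bare “human error” entry unless the sub-type and at least one influencing factor are named:

# The error sub-types come from the article; the validation rule is an
# assumed illustration of the argument, not a standard.
ERROR_TYPES = {"mistake", "slip", "lapse", "behavioural choice"}

def record_human_error(error_type: str, influencing_factors: list) -> dict:
    """Accept a human-error finding only when it states what the error was
    and which system or situational factor(s) influenced it."""
    if error_type not in ERROR_TYPES:
        raise ValueError(f"State what the error was: one of {sorted(ERROR_TYPES)}")
    if not influencing_factors:
        raise ValueError("Name at least one system or situational factor")
    return {"error_type": error_type, "influencing_factors": influencing_factors}

# Example: a slip influenced by a confusing control-panel layout.
finding = record_human_error("slip", ["look-alike buttons on the control panel"])
print(finding)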

Bibliography

  • Cooper, M. D., & Finley, L. J. (2013). Strategic safety culture road map. Franklin, IN: BSMS.
  • Norman, D. A. (1980). Errors in human performance. La Jolla, CA: University of California, San Diego, Center for Human Information Processing.
  • Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4(2-4), 311-333.
  • Reason, J. (1990). Human error. Cambridge: Cambridge University Press.

Comments

simon cassin
1 year ago

Hi Dom, thank you for your excellent article. Do you think that what you have explained is capable of helping us understand and describe the factors influencing what is commonly understood to be human error, without first considering the term’s validity? If an error is standard for humans, how can it be an error? If X performs Y to a criterion commensurate with the expected capability of X, how can that be an error? Are we considering the concept of error based on the maximum theoretical capability of X as opposed to its actual ability? There are multiple points to…

Paul
1 year ago
Reply to  simon cassin

If X couldn’t perform to the level required to perform Y, then I would suggest poor recruitment! Fault of management! I clearly don’t have the teaching of some who frequent these articles, but I read a lot on leadership, quality, H+S, behaviour, cognition, production, why, culture. I find these conversations very interesting and think there is a place in any aspect of business where cross-over of methods and disciplines will benefit individuals’ knowledge, which will filter into their business. I have to say I find the cognitive and behavioural aspects far more interesting than the topic of H+S. Just read into…

Dom Cooper
1 year ago
Reply to  simon cassin

Hi Simon, the validity of the term ‘human error’ has been debated for centuries by much brainier people than me, and they have not been able to settle on a single definition acceptable to all. I would not attempt to do so here, as I only have a finite time left on the planet, which is not enough to see if everyone would accept whatever I came up with – lol! Your question is: “Do I believe the human error debate would benefit from input from other disciplines, such as psychology / sociology / philosophy / economics?” The answer is…

Tim
1 year ago
Reply to  Dom Cooper

Excellent article Dom. Can I add, though, it’s definitely a yes for me! Trying to understand the way we all cock up was really helped by reading not-quite-mainstream books like Drunk Tank Pink and Culture Code – even biographies like ‘Don’t Tell Mum I Work the Rigs’ (indeed, especially the latter!).

Terry C
1 year ago

If it is human error that causes an accident/incident, how do you correct it?
In my opinion the management system should identify the hazards, control the risks and implement the risk control measures.
Read through all the HSE prosecutions that are published. How often is human error identified as the cause? I haven’t found one yet.
Basically it is a failure of the management system, whether a failure to train, a lack of competency, a failure to identify the risks, to implement control measures, etc.

Paul Mahoney
1 year ago
Reply to  Terry C

‘Basically it is a failure of the management system, whether a failure to train, a lack of competency, a failure to identify the risks, to implement control measures, etc.’

Isn’t failing to identify these issues itself a human error?

I would say it’s chicken and egg: which came first?

Terry C
1 year ago
Reply to  Paul Mahoney

Disagree entirely with your approach. What you seem to be saying is: human error is the cause of the majority of accidents/incidents. You have to drill down to find the cause(s) and potential corrective action(s). Just calling a failure a human error will not help you find the root cause(s) and suitable corrective actions.

Paul
1 year ago
Reply to  Terry C

You will never see a prosecution for human error, as there is no value in the HSE bringing the case. Any case brought would nearly always be one where they are confident they will be successful and can attribute fault/negligence/breach to a business or a suitably high-level leader/manager. You cannot eliminate all risks, and you are not required to. Any remaining risk you have to identify, control and reduce as far as reasonably practicable. Therefore, in almost every business an element of residual risk will remain, and as long as that is the case then technically it is always possible that…

Terry C
1 year ago
Reply to  Paul

How many incidents have you investigated where the root cause is solely down to human error? Based on this how do you correct human error?

Dom Cooper
1 year ago
Reply to  Terry C

In 2011/12 Alcoa got all their work crews to identify and fix ‘human-error’ traps in the workplace (e.g. conditions or situations likely to lead to error). For example, when I was an advanced scaffolder, if one of the lads turned up worse for wear from the night before, I would send them back to the yard. Alcoa had a great impact on their incident rate over those two years, but seem to have stopped after that. It seems consistency in identifying and fixing human error traps might be a good answer to the problem.

Terry C
1 year ago
Reply to  Dom Cooper

If, in your example, “the scaffolder came to work under the influence”, what did the management system say about coming to work “under the influence”? Did management solely rely on the workforce to monitor, or did they have other controls which they failed to implement, e.g. random drug tests, or effectively communicating the D&A policy? In my opinion, a person coming to work under the influence has taken a deliberate decision not to follow the D&A policy. This is not a human error. A different example would be a person driving a vehicle under the influence. What do you…

Terry C
1 year ago
Reply to  Terry C

Update: the UK PM accepted a fine of £100 for failing to wear a seatbelt. He said it was a “mistake”.
https://www.bbc.co.uk/news/uk-politics-64337866

I would estimate that there would have been three other people in the car with him: the driver, the armed response officer from the Metropolitan Police, and the cameraman. I have opinions on why no one challenged him, but obviously these may not be the same as the facts. I will keep my opinions to myself.

Dom Cooper
1 year ago
Reply to  Terry C

Hi Terry, in those days a safety management system per se was unknown (yes, I am old – lol). Drugs testing consisted of trying the products on offer, it seems, and nor did we have harnesses or hard hats: they were not common/standard. We relied on our training (CITB cards were brand new then), our more experienced workmates and our survival instincts.

Paul
1 year ago

This made for an interesting read, well balanced and reasoned, with a good level of reference and detail, as expected from Dominic Cooper. It provides a nice push-back, showing that error is not always so clear-cut.

Dom Cooper
1 year ago
Reply to  Paul

Thanks Paul. I appreciate the sentiments very much.

Nigel Evelyn-dupree
1 year ago

If we get into the “chain of causation” of so-called accidents, mishaps and errors, doesn’t “fatigue” play a significant part? Pretty sure no one really intended to blow up or sink their workplace, resulting in a number of their colleagues, or even the public, being KSI!? Work-stress, cognitive fatigue, performance anxiety and presenteeism, without a formal “Right to Disconnect” yet, all increase the risk of errors as shift patterns extended from 8 to 12 hours with the expectation that workers will be as functionally or optimally efficient at the end as they were at the beginning of their day or night shift…

Nick Hancock
1 year ago

Hi Dom. As per the other comments, many thanks for another insightful article. I am currently writing my dissertation for an MSc in Risk and Safety Leadership, and the focus is on improving safety culture. Would you agree that, with the right leadership and management processes in place, the safety culture of an organisation could be ‘good’ or ‘positive’ enough to the point where there is such a focus on the prevention of hazards and errors that the actions of an individual are considerably less likely to be the cause of an accident? For example, taking a situation where an…

Dom Cooper
1 year ago
Reply to  Nick Hancock

Hi Nick, I think you have to recognise that a behaviour can overcome and breach the most thorough risk controls in place. I know of someone who climbed over a 10 ft yellow fence – with warning signs, interlocks on the gates, etc., to keep people out of the area or safe inside it – to clear a blockage. Not a good decision, but it happens. There is an assumption that a really good safety culture can make the difference. I am convinced it can, but it has to be visible in what people do, and in the decisions taken by leadership/management…