January 12, 2023

Dominic Cooper: To err is human, or is it?

Credit: RTimages / Alamy Stock Photo

Dominic Cooper on whether incidents are caused solely by the system or by the person.

Hands up if you did not follow a rule, forgot something, made a mistake or took a short-cut in the past 24 hours. Did you stop to consider why you had done so? If you did, what was your conclusion? You [a] did not have enough information to do a task properly (failure in planning)? [b] just forgot or became distracted (failure in execution)? [c] tried to save time (made a behavioural choice)? Each explanation points to a category of human error.
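
To make the taxonomy concrete, here is a minimal sketch in Python (the class and example wording are mine, for illustration only, not from any formal standard) showing how those three self-check answers map onto the error categories:

    from enum import Enum

    class ErrorCategory(Enum):
        """The three broad categories of human error described above."""
        PLANNING = "mistake: a failure in planning"
        EXECUTION = "slip or lapse: a failure in execution"
        CHOICE = "an intentional behavioural choice"

    # Illustrative mapping of the three self-check answers above.
    examples = {
        "did not have enough information to do the task properly": ErrorCategory.PLANNING,
        "just forgot or became distracted": ErrorCategory.EXECUTION,
        "tried to save time": ErrorCategory.CHOICE,
    }

    for answer, category in examples.items():
        print(f"{answer!r} -> {category.name} ({category.value})")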

Given that most behaviours are goal-directed (e.g. make a cup of coffee, travel from A to B, fix a machine), human error is defined as “the failure of a planned action to achieve its goal”. To be termed human error, there must have been a plan, intention or goal, spawning a sequence of actions or behaviours, that somehow failed to achieve the original objective. Some believe errors are the result of a person’s failure to do things correctly, while others hold that error is always caused by external factors, not the person.

Eliminate

Currently, it seems, many OSH practitioners want to eliminate accountability and culpability in incident investigations. They assert that all errors and behaviours are entirely system-induced, i.e. our work or task systems always ‘compel’ us to do what we do: a person can never be at fault. This is largely due to the influence of Jens Rasmussen, who in the early 1980s stated that human error in the workplace should be considered “as man-task mismatches or misfits”. For example, a row of buttons on a control panel that all look alike but each have a different function can create confusion during a time of stress, resulting in an incorrect button press with significant adverse impact. Rasmussen, asking why machine interfaces or systems were designed or configured in such ways, asserted that people were an integral system component. He postulated that all human errors are caused by external forces that reside in, or impact, the system: legislation, regulation, the company, its management, staff, the type of work and task design. Thus he, and others in this cognitive engineering tradition, such as Erik Hollnagel, concluded that errors are consequences rather than causes; therefore people should not be blamed or held to account, as errors are forced on them entirely by some aspect of the system. Ultimately this led to the socio-technical model of incident analysis, whereby causation is determined by looking outwards from the person to the system(s), the environment and beyond. Typically, this would mean looking at [1] the [system] boundaries of what would be considered an acceptable state of affairs within a [task] system; [2] the individual’s resource profile (competence, skills, expertise); and [3] the available means of work (resources).

Pathogen Model

James Reason, also exploring human error at the same time, released his famous Pathogen Model of incident causation (aka Swiss Cheese) in 1990, which looked inwards to events from the outside (i.e. the opposite of the cognitive engineering approach). Reason agreed with Rasmussen that there were layers of system influence on errors and behaviour, but reduced these to only those within the company, while accepting that external influences (e.g. regulators) can impact internal decision-making. The key difference was that Reason highlighted how faults in each system layer interacted with the next to produce a holistic view of an incident’s trajectory. John Wreathall, from the US nuclear industry, added to Reason’s model by specifying the sources of system faults at each layer as [1] senior management; [2] line management; [3] operations support functions (HR, OSH, engineering, etc.); [4] people’s on-the-job behaviours; and [5] defences (lacking, or ineffective, barriers). Reason asserted that unintentional knowledge- and rule-based mistakes (Failures in Planning) created latent system faults that lay dormant in the organisation until triggered by active failures, such as unintentional slips and lapses (Failures in Execution) or intentional behavioural choices (e.g. [1] taking short-cuts; [2] ad-hoc optimisation of the way something is done; [3] overcoming organisational problems, such as the lack of the right equipment, in the right place, at the right time). Importantly, Reason argued that the person was separate from, or outside of, the system, and that errors were also the causes of incidents. He promoted the creation of a Just Culture (procedural justice) to overcome the human tendency to automatically blame people for an error: accountability and culpability were only apportioned after a series of questions had entirely discounted system factors as the cause.
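
As a rough illustration of the ‘holes lining up’ logic, here is a minimal simulation sketch in Python (the layer list follows Wreathall; the 10% fault probability and the independence assumption are invented for the example, not taken from Reason):

    import random

    # Wreathall's five sources of system faults, layer by layer.
    LAYERS = [
        "senior management",
        "line management",
        "operations support functions",
        "on-the-job behaviours",
        "defences / barriers",
    ]

    def incident_occurs(p_hole: float, rng: random.Random) -> bool:
        """An incident trajectory passes only when every layer happens
        to have a hole (a latent fault) at the same moment."""
        return all(rng.random() < p_hole for _ in LAYERS)

    rng = random.Random(42)
    trials = 100_000
    hits = sum(incident_occurs(0.1, rng) for _ in range(trials))
    print(f"Incident rate with five independent 10% layers: {hits / trials:.5f}")

Under these assumptions the expected rate is 0.1 to the power of 5, i.e. about one in 100,000: each intact layer multiplies down the chance of the holes aligning, which is why a persistent latent fault (one hole stuck open) so sharply raises the odds of an incident.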

Integrated Rasmussen / Reason categories of Human Error

The figure opposite shows both views of human error are compatible: Rasmussen’s boundaries show error arises from interactions between the status of a system, the resources available to it, and the competence or expertise of those operating within the system. Reason shows the sources of human errors are largely unintentional (mistakes, slips and lapses), but recognises people’s behavioural choices also play a part. Thus, errors can be both consequences and causes! The circumstances specific to an incident determine which is pertinent. Optimising a work or task system in conjunction with the workforce is an obvious way of eliminating the human error traps (precursors) that negatively influence people’s behaviour.

Beyond human

The melding of the two approaches shows the importance of incident investigations recognising that human error is often (but not always) the result of interactions between cognitive, system and situational factors: in other words, both human and organisational factors. As such, it is quite wrong to automatically assert either that the system(s) is the sole cause of a human error or that the person is solely to blame.

Moreover, it is equally wrong for incident investigations to cite human error as an incident cause without stating what the human error was – whether it was an unintentional mistake, slip or lapse, or an intentional behavioural choice – and which system or situational factor(s) influenced it. In this way, the human error construct becomes much more useful in creating or maintaining a just culture, while promoting organisational learning to prevent repeat occurrences.
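
One way to picture that requirement is a record structure that simply refuses a bare ‘human error’ label. This is a hypothetical sketch in Python (the class and field names are mine, not from any investigation standard):

    from dataclasses import dataclass, field

    @dataclass
    class HumanErrorFinding:
        """A finding that cannot be recorded as 'human error' alone:
        the error type and at least one influencing factor are required."""
        description: str
        error_type: str  # "mistake", "slip", "lapse" or "behavioural choice"
        influencing_factors: list[str] = field(default_factory=list)

        def __post_init__(self):
            allowed = {"mistake", "slip", "lapse", "behavioural choice"}
            if self.error_type not in allowed:
                raise ValueError(f"error_type must be one of {sorted(allowed)}")
            if not self.influencing_factors:
                raise ValueError("state the system/situational factor(s) involved")

    finding = HumanErrorFinding(
        description="wrong button pressed on a look-alike control panel",
        error_type="slip",
        influencing_factors=["identical buttons with different functions",
                             "time pressure during an upset condition"],
    )
    print(finding)

Recorded this way, a finding points directly at what to fix (the error trap) rather than at whom to blame.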

Bibliography

  • Cooper, M. D., & Finley, L. J. (2013). Strategic safety culture road map. Franklin, IN: BSMS.
  • Norman, D. A. (1980). Errors in human performance. La Jolla, CA: University of California, San Diego, Center for Human Information Processing.
  • Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4(2–4), 311–333.
  • Reason, J. (1990). Human error. Cambridge: Cambridge University Press.


Showing 18 comments
  • simon cassin

    Hi Dom

    Thank you for your excellent article. Do you think that what you have explained is capable of helping us understand and describe the factors influencing what is commonly understood to be human error without first considering the term’s validity?

    If an error is standard for humans, how can it be an error? If X performs Y to a criterion commensurate with the expected capability of X how can that be an error? Are we considering the concept of error based on the maximum theoretical capability of X as opposed to its actual ability?

    There are multiple points to consider relating to this term. Still, unfortunately, in H&S the conversation is generally restricted due to the currently accepted parameters and methodology used to evaluate the concept of error.

    My question is, do you believe the debate would benefit from input from other disciplines, such as philosophy and economics? There is a concept called ‘The great rationality debate’, which includes perspectives from numerous disciplines, and I believe it is relevant to the human error discussion; but whenever I have raised it with influencers in this area, they have all ignored the question and failed to respond or even acknowledge it.

    Have you heard of this concept? And if you have, do you think it is relevant to the discussion of human error?

    Thanks again for your article.

    Simon

    • Paul

      If X couldn’t perform to the level required to perform Y then I would suggest poor recruitment! Fault of management!
      I clearly don’t have the teaching of some who frequent these articles, but I read a lot on leadership, quality, H+S, behaviour, cognition, production, Why, culture. I find these conversations very interesting and think there is a place in any aspect of business where cross-over of methods and disciplines will benefit individuals’ knowledge, which will filter into their business. I have to say I find cognition and behavioural aspects far more interesting than the topic of H+S.
      Just read into ‘The great rationality debate’ – thank you for giving further reading.

    • Dom Cooper

      Hi Simon, the validity of the term ‘human error’ has been debated for centuries by much brainier people than me, and they have not been able to arrive at a single definition acceptable to all. I would not attempt to do so here, as I only have a finite time left on the planet, which is not enough to see if everyone would accept whatever I came up with – lol!

      Your question is: “Do I believe the human error debate would benefit from input from other disciplines, such as psychology / sociology / philosophy / economics?” The answer is probably yes. Why probably? Because multiple disciplines have already had an input over the centuries, and we are still trying to resolve some issues, as my article points out: “is it an inside-out view, or an outside-in view?” I argue both are compatible. Others likely will disagree.

      The question of whether humans are rational or not in relation to human error would produce some interesting answers, I suspect, which would probably reflect disciplinary world-views, meaning a definitive answer is unlikely – lol.

      • Tim

        Excellent article Dom. Can I add though, it’s definitely yes for me! Trying to understand the way we all cock up is really helped by reading not-quite-mainstream books like Drunk Tank Pink and Culture Code – even biographies like ‘Don’t Tell Mum I Work on the Rigs’ (indeed, especially the latter!).

  • Terry C

    If it is human error that causes an accident/incident, how do you correct it?
    In my opinion the Management System should identify the hazards, control the risks and implement the risk control measures.
    Read through all the HSE prosecutions that are published. How often is human error identified as the cause? I haven’t found one yet.
    Basically it is failure of the management system – whether failure to train, lack of competency, failure to identify the risks, failure to implement control measures, etc.

    • Paul Mahoney

      ‘Basically it is failure of the management system – whether failure to train, lack of competency, failure to identify the risks, failure to implement control measures, etc.’

      Isn’t the failure to identify these issues itself human error?

      I would say it’s chicken and egg: which came first?

      • Terry C

        Disagree entirely with your approach. What you seem to be saying is: human error is the cause of the majority of accidents/incidents. You have to drill down to find the cause(s) and potential corrective action(s). Just calling a failure a human error will not help you find the root cause(s) and suitable corrective actions.

    • Paul

      You will never see a prosecution for human error, as there is no value in the HSE bringing the case. Any case brought would nearly always be one where they are confident they will be successful and can attribute fault/negligence/breach to a business or a suitably high-level leader/manager.
      You cannot eliminate all risks, and you are not required to. Any remaining risk you have to identify, control and reduce as far as reasonably practicable. Therefore in almost every business an element of residual risk will remain, and as long as that is the case it is technically always possible that an accident could take place. And that is where you may find that it could have been a human lapse. The business did not fail to manage if it did all that was reasonably required of it. If the business could do no more, does that by default make it a human error?

      • Terry C

        How many incidents have you investigated where the root cause is solely down to human error? Based on this, how do you correct human error?

    • Dom Cooper

      In 2011/12 Alcoa got all their work crews to identify and fix ‘human error’ traps in the workplace (e.g. conditions or situations likely to lead to error). For example, when I was an advanced scaffolder, if one of the lads turned up worse for wear from the night before, I would send them back to the yard. Alcoa had a great impact on their incident rate over those two years, but seemed to have stopped after that. It seems consistency in identifying and fixing human error traps might be a good answer to the problem.

      • Terry C

        If, in your example, the scaffolder came to work under the influence, what did the management system say about coming to work “under the influence”? Did management rely solely on the workforce to monitor, or did they have other controls which they failed to implement, e.g. random drug tests and effectively communicating the D&A policy? In my opinion, a person coming to work under the influence has taken a deliberate decision not to follow the D&A policy. This is not a human error.

        A different example would be a person driving a vehicle under the influence. What do you think a judge would say when the defence was “human error”?

        In today’s news, the UK PM has been pictured not wearing a seat belt whilst a passenger in a moving vehicle. His excuse is an error of judgement? How do you prevent such an error? A possible technical solution: a car won’t move until all seatbelts are engaged. The majority of cars have warning lights when seat belts are not engaged; also, the professional driver didn’t check all persons were belted up. I presume the driver had a rear-view mirror? Hierarchy of control comes to mind.

        • Terry C

          Update: the UK PM accepted a fine of £100 for failing to wear a seatbelt. He said it was a “mistake”.
          https://www.bbc.co.uk/news/uk-politics-64337866

          I would estimate that there were three other people in the car with him: the driver, the armed response officer from the Metropolitan Police, and the cameraman. I have opinions on why no one challenged him, but obviously these may not be the same as the facts. I will keep my opinions to myself.

        • Dom Cooper

          Hi Terry, in those days a Safety Management System per se was an unknown (yes, I am old – lol). Drug testing consisted of trying the products on offer, it seems, and we didn’t have harnesses or hard hats either: they were not common/standard. We relied on our training (CITB cards were brand new then), our more experienced workmates and our survival instincts.

  • Paul

    This made for an interesting read: well balanced and reasoned, with a good level of reference and detail, as expected from Dominic Cooper. It pushes back nicely, showing that error is not always so clear cut.

    • Dom Cooper

      Thanks Paul. I appreciate the sentiments very much.

  • Nigel Evelyn-dupree

    If we get into the “chain of causation” of so-called accidents, mishaps and errors, doesn’t “fatigue” play a significant part? Pretty sure no one really intended to blow up or sink their workplace, resulting in a number of their colleagues, or even the public, being KSI!?

    Work stress, cognitive fatigue, performance anxiety and presenteeism – still without a formal “Right to Disconnect” – all increase the risk of errors, as shift patterns have extended from 8 to 12 hours with the expectation that workers will be as functionally or optimally efficient at the end of their day or night shift as at the beginning, presuming sufficient quality of rest and sleep in between shifts.

    The old days of “time and motion” studies have long been ignored: regular structured breaks every 90 minutes from intensive activity, tea breaks morning and afternoon, and an hour for lunch away from the workstation to eat, socialise, complete personal calls and/or quick shopping errands, returning to work after a sufficient cognitive or physical break in activity to potentially recover and perform as well in the p.m. as in the a.m.

    Then of course, in the 21st digital century, there is the question of “work exposure limits” coming to the fore, exacerbated by increasing addiction to inaccessible, unmitigated display screen devices, resulting in visual repetitive stress injuries and myopic and asthenopic disease.

    By 1998 we had PUWER (the Provision and Use of Work Equipment Regulations) addressing reasonable adjustments and accommodations to prevent or mitigate occupational health risks linked to safe operator-equipment usage. However, regardless of HSE Better Display Screen research (RR 561, 2007) and/or the 2016 WHO ICD-10, actually addressing “product safety” had to wait until the 2018 Accessibility Regulations.

    The display screen equipment could be infinitely adjustable to accommodate the user-operator’s visual ergonomic needs; nevertheless, it has only been those with pre-existing impairments making reasonable adjustments, leaving those with induced repetitive stress injuries to suffer debilitating eye-strain, binocular vision stress, blurred or, worse, double vision, and migraine, with a high proportion of the 58% experiencing other repetitive stress injuries (MSDs/MSKs).

    And that’s without getting into other sources of psychosocial stressors in the working environment driving presenteeism and an average 20% loss of efficiency or productivity.

  • Nick Hancock

    Hi Dom. As per the other comments, many thanks for another insightful article. I am currently writing my dissertation for an MSc Risk and Safety Leadership, and the focus is on improvement to safety culture.

    Would you agree that, with the right leadership and management processes in place, the safety culture of an organisation could be ‘good’ or ‘positive’ enough that there is such a focus on the prevention of hazards and errors that the actions of an individual are considerably less likely to be the cause of an accident?

    For example, taking a situation where an individual intentionally causes an accident, such as ignoring safety measures on machinery, do you believe it is possible for the organisational measures ever to be sufficient to prevent a scenario like that from occurring? Clearly with that example there would need to be a strong investigation into the motives behind the action, but is there a hypothetical point beyond which there is no more an organisation can do to prevent that individual’s actions?

    I would imagine my question is more akin to “Do HROs with a ‘good’ safety culture prove Rasmussen’s theory owing to sufficient safety management to essentially prevent accidents?”

    • Dom Cooper

      Hi Nick, I think you have to recognise that a behaviour can overcome and breach the most thorough risk controls in place. I know of someone who had climbed over a 10 ft yellow fence, with warning signs, interlocks on the gates, etc., designed to keep people out of the area or safe inside it, to clear a blockage. Not a good decision, but it happens. There is an assumption that a really good safety culture can make the difference. I am convinced it can, but it has to be visible in what people do and in the decisions taken by leadership/management, etc. It is not easy to maintain a consistency of focus, purpose and execution, but there are companies who do (ExxonMobil springs to mind), yet incidents still occur.

      I am not aware of any HRO being incident-free. In fact, an HRO’s mantra is often “fail often, but fail safely”, i.e. use near-misses as the means to identify ‘gaps’ in the prevailing controls and fix them. How many organisations investigate every near-miss? Not many!

      Moreover, an HRO is not necessarily one with a good/great safety culture. Do not be seduced by that notion. They may have great systems, but terrible leadership and management. Remember the triad: system-behaviour-psychology. They are reciprocal. Being good in one area does not automatically mean being good in the other two. That is why we do safety culture assessments.
