Striving for complete objectivity in safety investigations is futile because hindsight bias – a significant factor in the Buncefield inquiry – can never be eliminated. What, asks Professor Richard Booth, does this mean for how we learn to prevent incidents from recurring?
“There is a universal tendency to perceive past events as somehow more foreseeable and more avoidable than they actually were,” explains James Reason.1
“Our knowledge of the outcome unconsciously colours our ideas of how and why it occurred. To the retrospective observer, all the lines of causality home in on the bad event; but those on the spot, possessed only of foresight, do not see this convergence.”
People who believe, in good faith, that they have ‘factored out’ the bias of hindsight are mistaken (including the author2), as Fischhoff explains in more detail: “The hindsight bias is a projection of new knowledge into the past accompanied by a denial that the outcome information has influenced judgement.”3
Hindsight bias may have played a part in investigations (and media reports) of a wide diversity of adverse events, from diagnostic errors by physicians to lapses in the supervision of children at domestic risk. People are often made scapegoats following such events, as Fischhoff further states: “Consider a decision-maker who has been caught unprepared by some turn of events and who tries to see where he went wrong . . . If, in retrospect, the event appears to have seemed relatively likely, he can do little more than berate himself for not taking the action which his knowledge seems to have dictated . . . When second-guessed by a ‘hindsightful’ observer his misfortune appears to have been incompetence, folly, or worse.”3
Buncefield: the immediate events
Hindsight bias is a malign influence on the objectivity of incident investigations and was a major factor in the Buncefield inquiry. To see how this manifested, a reminder of events leading up to the incident is helpful.
The explosion on 11 December 2005 at the Buncefield depot of Hertfordshire Oil Storage Ltd (HOSL) – a joint-venture company split 40:60 between Chevron and Total, the latter of which had responsibility for safety – followed an uncontrolled release of 250,000 litres of petrol from a HOSL storage tank.
The ignition source was a spark from a fire-pump motor, which started up when the emergency shutdown systems were belatedly activated.
The overflow was a result of a failure of an unreliable automatic tank gauge (ATG) level indicator/alarm, and the non-operation of an independent high-level switch (IHLS), which had, as it emerged, a crippling latent defect. Both devices were supplied and maintained by Motherwell Control Systems 2003 Ltd (MCS).
An experienced HOSL night-shift pipeline supervisor mistakenly thought that a low flow-rate pipeline (as recorded in the day-shift log) was being used to fill the tank – the result of an error by the day-shift supervisor and a cursory shift handover.
The night supervisor, with multiple other commitments and concerns, did not realise that the ATG alarm had failed to operate, or later, that the tank was overflowing. Had he monitored the level in the tank, he would have recognised that the ATG had seized up. With hindsight bias the presumption is that personnel should have spent their time exclusively addressing the issues that subsequently emerged as critical.
The endemic ATG unreliability was poorly communicated to HOSL managers, who were working under intense pressure to finish their COMAH report and deal with an emergency at another Total depot. The control-room supervisors ‘called out’ MCS staff to rectify ATG faults when they occurred, but engineers failed to deal effectively with the problem.
Clearly, the failure of the IHLS to cut off the supply to the tank was the crucial immediate cause of the overflow. However, the pipeline supervisor could not have known about the IHLS’s latent inoperability, as no one knew about it. He was doing his best in demanding circumstances, and was unfairly made a scapegoat for his inaction.
Primary responsibility for the faulty IHLS (and ATG) lay with MCS. HOSL staff would have needed extraordinary insight to counter the belief of MCS engineers that padlocks supplied with the IHLSs were merely an optional security provision – in fact, they were essential as a balancing weight. Moreover, after installation, the engineers simply deposited the padlocks in the HOSL control room.
The IHLS defect could only have been detected by a fully realistic in-situ test. But a test involving the removal of IHLSs from tanks was considered sufficient by all parties. With hindsight bias, of course, this decision appears a culpable error.
The prosecution of Total UK Ltd
The evidence presented by the Crown in the prosecution of Total UK Ltd was replete with hindsight bias. An aggravating factor underpinning the prosecution’s case was that Total should have foreseen a major explosion as a consequence of an overflow, and introduced commensurate preventive measures.
Although it pleaded guilty to breaches of sections 2(1) and 3(1) of the HSWA 1974, Total, in my opinion, had good grounds for pleading not guilty, but realised such a plea was unlikely to find favour with the court. Juries themselves are not immune to hindsight bias, while mitigation based on arguments that appear to constitute a defence is only likely to irritate a judge.
However, while its guilty plea was based on a realistic appraisal of the situation, Total lost the opportunity at trial to challenge the evidence of the other defendants – notably HOSL, which pleaded not guilty (unsuccessfully) on the grounds that Total was wholly responsible for its operations.
Total was, in my opinion, taking appropriate steps to secure and monitor safety at HOSL. However, these arrangements, supported by commissioned best-practice audits, failed to identify substantial operational shortcomings at HOSL and MCS. To the prosecution, the fact that such good or best practice failed to detect the deficiencies in overflow prevention at plant level was self-evident ‘proof’ that the systems were inappropriate and loosely implemented.
In its 2008 report, the Major Incident Investigation Board stated that a major explosion had not been considered ‘realistically credible’.4 Before Buncefield, the conventional wisdom of the oil ‘majors’, accepted by the regulator, was that the worst tank-overflow scenario was a fire contained within a bund (i.e. not a major accident hazard (MAH) as defined by the COMAH Regulations). After the event, it seems a catastrophe waiting to happen, which Total should ‘uniquely’ have anticipated and prevented. (Inexplicably, according to the HSE’s COMAH guidance,5 a tank overflow would appear to qualify as a MAH even if a bund fire was the worst foreseeable consequence.)
Confusion over whether overflow from tanks should be regarded as a MAH under the site’s COMAH regime was an important root cause of the overflow itself. A quantitative risk assessment was not completed, as it was not required for the COMAH report, and overflow prevention was not included in COMAH training. Of course, overflows of flammable liquids are highly undesirable on any terms, but the preventive challenge went ‘below the radar’ largely because of the narrow focus on COMAH MAHs.
System failures and hindsight bias
The failure of Total’s ‘high-level’ systems to address the failings of MCS, as well as inadequate control-room procedures, led to a range of hindsight biases. The prosecution argued that the audits were critical of HOSL’s safety systems, and Total and HOSL were censured for not taking the findings seriously.
In fact, the results of the audits were generally positive, and were thoroughly reviewed and acted upon. However, the audits did not engage sufficiently with day-to-day operational issues. Consequently, a key learning point was obscured by the investigation.
An important lesson from Buncefield is that even good or best practice in the safety oversight of subsidiary companies may be weak. Total had every reason to believe that its systems were sufficiently robust to ensure safety at HOSL, but the ‘reach’ of those systems was deficient. To uncover long-standing but unacceptable local working practices, Total would have had to ‘micro-manage’ HOSL’s operation. But micro-management of subsidiaries is anathema in business.
Some of the causal factors in the Buncefield incident also have a long ancestry. Inadequate shift logs and handovers, for example, have been significant causal factors in many major incidents. That these shortcomings are repeated reflects a failure to learn from experience – perhaps just as bad as learning the wrong lessons. This criticism applies to the chemical-process industry generally.
Flaws in shift record-keeping may be associated with subsets of organisational culture, some of which come down to the custom and practice of individual work groups. The effectiveness of initiatives to improve shift records may therefore depend on changing deeply entrenched beliefs at shift level.
Challenges to learning lessons
The effects of hindsight bias are uniformly adverse when blame and guilt are assigned retrospectively in accident cases. But hindsight itself, used to draw the ‘correct’ lessons from an event, can better inform incident investigators about the future control of risks.
As long as bias is minimised, the simplifications of hindsight are actually an advantage. The sequence of events and conditions leading up to an incident is clearly the most important issue, even if people are blamed unduly for their part. Hindsight-bias-aware root-cause investigation approaches provide coherent and simple (but not simplistic) explanations of past events, which can be used to devise appropriate preventive measures. However, the following challenges can threaten such an outcome.
Heightened perceptions of risk: In the immediate aftermath of serious incidents, perceptions of risk rise sharply, and new controls may be introduced that are overly bureaucratic and disproportionate. Such precautions should be re-evaluated a year, say, after their imposition, when passions have cooled, with a view to dismantling any that prove unduly austere.
Inexplicable behaviour: Investigators should ask the question: why did participants believe at the time that their actions were rational? Professor Andrew Hale once told me that his mantra in investigations is: “To go on analysing data until you feel that you, too, in the given circumstances, would have made the same decision, which proved, in practice, to be wrong.” All this counters bias predicated on the fact that blameworthiness is magnified if investigators think that acts were not only unsafe but also inexplicable.
A further difficulty is that investigators do not start from a completely blank canvas. They come to the investigation with beliefs derived from their experience, which, unwittingly, act as the filter through which they examine the facts and draw conclusions. The classic preconception is that accidents are caused by unsafe acts, not unsafe conditions.
Analytical methods for incident investigations: A key approach to minimising bias and promoting foresight is the adoption of structured incident analysis models. My preferences are to use Events and Causal Factors Analysis (ECFA) and Fault Tree Analysis (FTA), sometimes in combination.
The argument runs that the detailed analysis of an incident in linear stages provides a comprehensible structure with which to chart a route that would otherwise be even more opaque and unpredictable.
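The FTA approach in particular lends itself to a simple quantitative sketch. The fragment below is a toy illustration only, not part of the Buncefield analysis: the event names and probabilities are invented for demonstration, and the failures are assumed to be independent (rarely true in practice).

```python
def and_gate(*probs):
    # AND gate: the output event occurs only if every input event occurs.
    # With independent inputs, probabilities multiply.
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    # OR gate: the output event occurs if any input event occurs.
    # 1 minus the probability that all inputs fail to occur.
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical per-fill failure probabilities (illustrative values only)
p_atg_fails = 0.05        # automatic tank gauge seizes
p_ihls_fails = 0.01       # independent high-level switch inoperable
p_operator_misses = 0.5   # supervisor does not notice, given no alarm

# The top event (overflow) requires every layer of protection to fail at once
p_overflow = and_gate(p_atg_fails, p_ihls_fails, p_operator_misses)
print(f"P(overflow per fill) = {p_overflow:.6f}")
```

The AND gate captures the defence-in-depth logic described above: an overflow demands the simultaneous failure of the gauge, the switch and the operator, which is why a single latent defect in the IHLS was so consequential.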
Conclusion
Hindsight bias is uniformly a harmful influence in the evaluation of bad events, demonstrated not least by the Buncefield explosion. In my opinion, a key cause of the Buncefield explosion, namely the limited ‘reach’ of good or best-practice safety systems, has been masked by the prosecution’s desire, promoted by hindsight bias, to denounce the efforts of Total.
An equally important conclusion is that while hindsight bias cannot be eliminated, hindsight is the essential means for learning from experience. Hindsight bias-aware investigations, where the facts are structured, ‘simplified’ and made comprehensible, are the foundation for coherent efforts to prevent repetitions of adverse events. Incident-investigation training courses that embrace an analysis of hindsight bias, together with the practical use of structured investigation models, may improve the utility of investigations.
The underlying causal factors of major incidents are repeated time and time again. However, it is the root causes, mainly associated with organisational and group culture, as well as conventional wisdoms, that need to be addressed. Prevention ultimately depends on explicit programmes to address root causes – this is what we have yet to learn fully from history.
References
1 Reason, JT (2008): The human contribution: unsafe acts, accidents and heroic recoveries, Ashgate, Farnham, England
2 I was an ‘expert’ engaged to assist the court by Total, which was prosecuted following the incident. My role is a further source of bias to which readers should be alert, and there are, doubtless, others. The opinions expressed here are entirely my responsibility
3 Fischhoff, B (2003): ‘Hindsight is not equal to foresight: the effect of outcome knowledge on judgment under uncertainty’, in Quality and Safety in Health Care, 12: pp 304-312 (a reprint of a paper in Journal of Experimental Psychology: Human Perception and Performance, (1975); volume 1, pp 288-299)
4 MIIB (2008): ‘The Buncefield Incident, 11 December 2005. The final report of the Major Incident Investigation Board’ – www.buncefieldinvestigation.gov.uk/reports
5 HSE (1999) (second edition 2006): ‘A guide to the Control of Major Accident Hazards Regulations 1999 (as amended), Guidance on Regulations’, L111, HSE Books
Richard Booth is Professor Emeritus at Aston University, Birmingham and director of HASTAM. He will be speaking on this subject at the IOSH Conference on Wednesday, 7 March.