Managing information – Stir of echoes

October 27, 2010

Any organisation, no matter what industry it operates in, has the potential to become tomorrow’s safety-disaster headline. To help practitioners avoid such a scenario, Martin Anderson looks at several organisations that have experienced major safety incidents, and identifies key themes.

The oil, gas and chemical industry comprises some of the world’s largest and most profitable organisations, which often have well-developed safety management systems and many years of operational experience. Even so, inspections and investigations may identify serious organisational failings and precursors to major incidents. Moreover, the findings of major investigations are often strikingly similar to those revealed in previous incidents.

As a starting point, consider that investigations into major incidents in a wide range of industries frequently reveal that information that could have prevented the event, or that identified worrying trends, already existed somewhere in the organisation. This information, pointing to ‘trouble brewing’, either does not reach those who could take appropriate action, or is not acted upon.

Texas City

The Baker Panel report, for example, which followed the explosions at the BP Texas City refinery on 23 March 2005, concluded: “BP’s executive management either did not receive refinery-specific information that suggested process-safety deficiencies at some of the US refineries, or did not effectively respond to the information that it did receive.”1

In the years prior to the incident, serious concerns had been raised at the refinery as a result of various audits and investigations. However, as Professor Andrew Hopkins states in his book on the disaster: “In short, the bad news that leaders needed to hear did not reach them.”2

This issue is neither unique to BP nor to the oil and gas industry. A key lesson from history is that the organisational causes of major accidents echo those that have occurred before. As investigations into major events unfold, it is possible to draw parallels with other, seemingly unrelated, high-profile events in other industries. It’s worth reminding ourselves of some of these incidents. 

King’s Cross

In November 1987, a fire broke out under an escalator at King’s Cross underground station, leading to a flashover and 31 fatalities. This fire was by no means the first on the Underground system; in fact, some 800 fires had previously been recorded. In his report into the circumstances of the fire, Desmond Fennell QC said: “I have said, unequivocally, that we do not see what happened on the night of 18 November 1987 as being the fault of those in humble places.”3

NASA space-shuttle disasters

In both of the NASA space-shuttle disasters – Challenger breaking up on launch, and Columbia breaking up on re-entry – there was what has been described as a long incubation period.4 During the launch of Columbia in January 2003, debris breaking off the external fuel tank damaged the thermal protection on the shuttle’s left wing. Columbia burnt up on re-entry as hot gases entered the damaged wing structure. This was not a new phenomenon: 79 of the previous 113 shuttle missions had involved material from the external fuel tank striking (and damaging) the shuttle during take-off.

Sadly, relevant information was available not only on the specific failure mode that led to the loss of Columbia, but also from a previous catastrophic event, in which the Challenger shuttle and all of its crew were lost. There were many opportunities to learn from this information, and from the concerns raised by staff prior to Columbia’s final launch. The failure to do so made disaster all but inevitable.

Bristol Royal Infirmary

In a completely different sector, Bristol Royal Infirmary recorded a significantly higher mortality rate for open-heart surgery on children under 12 months of age than other centres in England. In five of the seven years between 1988 and 1994, the mortality rate at Bristol was roughly double that found elsewhere. The subsequent multi-million-pound inquiry, which reported its findings in 2001, was the biggest investigation into the workings of the NHS ever carried out.5

As with other similar reports, the findings of the inquiry ran to many hundreds of pages. Within all of that material, however, the most striking aspect for me is that questions had been asked about the mortality rate for several years. Warning signs were not recognised, and people who raised concerns were ignored and, it was alleged, threatened. Two surgeons and the chief executive were found guilty of serious professional misconduct.

The whistle-blower, a consultant cardiac anaesthetist, along with some of his colleagues, outlined concerns to the chief executive as early as 1990. He also submitted data on mortality rates to the Department of Health. The anaesthetist’s own words are disturbing: “In the end, I just couldn’t go on putting those children to sleep, with their parents present in the anaesthetic room, knowing that it was almost certain to be the last time they would see their sons or daughters alive.”

But were lessons learnt? During this year’s inquiry into care provided by the Mid Staffordshire NHS Foundation Trust, the chair of the inquiry, Robert Francis QC, stated that the problems with standards of care had existed for a long time and were known to those in charge.6

It seems clear from these events that preventative information was available, but trends were not recognised, information did not reach those in a position to act, or key individuals were unwilling, or unable, to act on the facts.

Terrorist plots

In a more recent example, on Christmas Day 2009, a passenger allegedly tried to ignite explosives aboard Northwest Airlines flight 253. Early investigations revealed that information already held within the intelligence community could, and should, have been pieced together. In effect, the myriad US security, intelligence and counter-intelligence agencies were unable to ‘connect the dots’.

In a speech delivered a few days after the plot, President Obama said: “When our government has information on a known extremist and that information is not shared and acted upon as it should have been, so that this extremist boards a plane with dangerous explosives that could cost nearly 300 lives, a systemic failure has occurred. We need to learn from this episode and act quickly to fix the flaws in our system because our security is at stake and lives are at stake.”

Learning from experience is easier said than done. After the terrorist attacks on 11 September 2001, for example, President Bush reported a change in philosophy: “Information must be fully shared, so we can follow every lead to find the one that may prevent tragedy.” However, this did not appear to happen before the Christmas 2009 attack.

The 9/11 Commission reported that information was often assessed on a case-by-case basis, with limited strategic analysis. It recommended new information-sharing systems that transcend traditional government boundaries, built not on the old principle of “need to know” but on a “need to share”.7

These examples show that assembling, communicating and acting on information is key to safety performance in a wide range of human endeavours. However, given that some of the world’s largest, most complex, and powerful organisations have failed to accomplish these tasks, this may not be as straightforward as it seems.

The organisations I have visited operate in industries with highly developed management systems and procedures for internal monitoring, review and independent auditing – so it may be surprising that they still experience major incidents. You have probably heard the expression “an accident waiting to happen”. Unfortunately, many of the major events described above clearly fit that description.

How to manage information better

The 1979 report into the nuclear incident at Three Mile Island is an early reminder not to become complacent. It stated that “one must continually question whether the safeguards already in place are sufficient to prevent major accidents”.8

The following guidance may help you reflect on how your own organisation manages safety-related information. 

A key component of any safety management system is the feedback loop – checking that all is as it should be and, perhaps less frequently, asking whether things could be improved. Many organisations are good at initiating programmes and systems, but not particularly good at checking whether they are working effectively, or whether they were the right systems in the first place.

Auditing is a key tool in any safety management system, but three common failures in its application often blunt it. Firstly, audits tend to portray the organisation in a good light, when they should really be looking for, and reporting on, the bad news. There needs to be a change in philosophy on the role of audits: rather than confirming that all is well, they should seek out opportunities to learn.

Secondly, audits are often compliance checks, reporting on whether safety systems and processes are in use, rather than asking whether systems are fit for purpose. Finally, in many organisations, I see safety initiatives that are aimed solely at front-line employees; however, issues that come to light following major accidents tend to relate to wider organisational factors and managerial behaviour. I rarely see auditing that effectively confronts these aspects.

For each safety barrier that you think you have – whether physical, procedural or systemic – there are three key questions to ask: (i) does it exist? (ii) is it any good? and (iii) is it used? Simply asking these questions is not enough, however: the answers must reach those in your organisation with the authority and inclination to act, where appropriate. The findings may point to underlying failures not of individuals but of the management system, and how your organisation responds to these challenges is key.

Capitalising on the experience and knowledge of those at the sharp end of the organisation is one of the keys to success, however success is measured. I’m reminded of the recent television programme in which business leaders worked undercover in a variety of jobs in their own organisations. In every case, the senior executives learned of issues and concerns that were undermining the effectiveness of their business. Ask yourself:

• Do the leaders in your organisation want to hear what is really happening?
• Is it possible that ‘gatekeepers’ at a certain level in the hierarchy prevent such information from moving up the organisation?
• How do your leaders know that the information reaching them forms a complete and accurate picture of performance in health, safety, quality, service, or any other metric key to the organisation?

Near misses are often described as opportunities to learn from ‘free’ information – free in the sense that a near miss does not carry severe financial or other consequences. However, there is a learning step even before near misses occur: the issues and concerns raised by the workforce are an early-warning system.
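As a purely illustrative sketch of what such trend-spotting might look like – the short Python example below uses invented monthly report counts and is my own addition, not drawn from the article – routine monitoring can flag when recent figures drift above a historical baseline:

    # Purely illustrative: flag when recent near-miss counts drift
    # above a simple statistical control limit. All data invented.
    from statistics import mean, stdev

    monthly_counts = [4, 5, 3, 6, 4, 5, 4, 7, 9, 11, 12, 14]  # hypothetical site data

    baseline = monthly_counts[:8]                   # earlier months define "normal"
    limit = mean(baseline) + 2 * stdev(baseline)    # simple upper control limit

    for month, count in enumerate(monthly_counts[8:], start=9):
        if count > limit:
            print(f"Month {month}: {count} reports exceeds limit of {limit:.1f}")

The point is not the statistics but the principle: reports that look unremarkable in isolation can, taken together, form exactly the kind of warning sign described in the case studies above.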

Besides a reactive system, there is huge value to be gained from proactive interventions, such as informal focus-group discussions or site walkabouts by senior managers. Not only will you unearth useful information, but these activities send a message to staff that their views are considered important – which encourages future participation. You must then act, and act quickly, on the information gained: either feed back the actions that have been taken, or explain clearly why none have been.

To encourage an ‘open culture’, you must consider how you treat people who raise issues and concerns. Many organisations state that they have an open-door policy on raising concerns, but this is sometimes perceived by staff to be the door leading off the site! How do you know that staff are not afraid to report their concerns?

Conclusion

From the case studies above, we can see that individual pieces of information need to be assessed to determine whether they are warning signs. This can only be achieved if they are viewed in the context of other information and analysed for trends or relationships. You may not consider ‘information’ to be one of your safety systems, but it is an issue that requires review just like any other critical aspect of your business. Collecting the full range of information, from the right people and places, and acting on it appropriately, are key to avoiding catastrophe.

Senior leaders must not assume, on the basis of injury or accident statistics, that no issues exist in the organisation that may later lead to a serious incident. The starting point should be the opposite assumption: that major safety issues do confront the organisation, and that various initiatives should be harnessed to identify what they are.

References
1 The Report of the BP US Refineries Independent Safety Review Panel (2007) – www.csb.gov/assets/document/Baker_panel_report1.pdf
2 Hopkins, A (2009): Failure to Learn: The BP Texas City Refinery Disaster, CCH Australia Ltd
3 Fennell, D (1988): Investigation into the King’s Cross Underground Fire, Department of Transport, HMSO
4 Columbia Accident Investigation Board (2003) – http://caib.nasa.gov
5 Learning from Bristol (2001): The Report of the Public Inquiry into Children’s Heart Surgery at the Bristol Royal Infirmary 1984–1995 – www.bristol-inquiry.org.uk
6 Francis, R (2010): Independent Inquiry into Care Provided by Mid Staffordshire NHS Foundation Trust, January 2005 – March 2009, HMSO – www.midstaffsinquiry.com
7 The 9/11 Commission Report (2004): Final Report of the National Commission on Terrorist Attacks Upon the United States – www.9-11commission.gov
8 Report of the President’s Commission on the Accident at Three Mile Island (1979): US Government Printing Office, Washington DC

Martin Anderson is a specialist in human factors, with a particular interest in organisational failures and cultural issues.
