March 21, 2016


Can accidents caused by human error ever be avoided?

By Mary Clarke, CEO, Cognisco

Thankfully, major accidents are relatively rare events these days, thanks to the many safeguards and defences built into modern technologies and processes. However, they still occur.

On 9 February in Bavaria, Germany, 11 people died and at least 80 more were injured when two passenger trains collided at 100 km/h.

Human error is thought to be behind that terrible accident, and the man allegedly responsible is facing involuntary manslaughter charges, with a possible five-year jail term.

Human error is a term commonly used when companies are unable to blame systems, technology or processes when things go wrong.

It occurs when people have failed to act in the right way – they may have taken a shortcut to save time, forgotten to follow one step of a process correctly, or misunderstood the defined process entirely. In each case, their actions can have disastrous consequences.

For health and safety practitioners, it can be a hugely challenging area, as there is no easy way to mitigate human error.

Take the Bavarian crash: why did this seemingly qualified Area Controller decide to open the track to two trains on a collision course, only to notify both drivers afterwards?

The Chief Prosecutor, Wolfgang Giese, said: “If he had complied with the rules…there would have been no collision”.

Perhaps that seems slightly obvious?

When audits and investigations take place, it may transpire that the Controller took (and passed) all the right training, understood the correct processes to follow and was an experienced and good employee – so why did things go wrong?

Most large companies, particularly those in highly regulated industries like rail, invest millions if not billions of pounds in embedding systems, technology and processes to mitigate risk and to uphold and improve safety standards and efficiency.

Yet, mistakes still happen and the root cause is often complex.

James Reason, Professor of Psychology at Manchester University and an expert on the role of human error in accident causation, explains it using his ‘Swiss Cheese Model’ (1).

Reason says that 99.9% of the time a company’s systems, processes and people will align and work together with no issues, despite small gaps in some places. However, when holes and gaps in different layers shift and align, everything can change in an instant.
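Reason’s picture can be sketched numerically. The toy Python snippet below is purely illustrative – the four named layers and the 1-in-100 failure figures are assumptions made for this sketch, not data from the Bavarian case – but it shows how independent layers of defence multiply the residual risk down, and how sharply that risk rises when one layer is effectively removed.

```python
from functools import reduce
from operator import mul

def aligned_failure_probability(layer_probs):
    """Chance that the holes in every defensive layer line up at the same
    moment, assuming each layer fails independently of the others."""
    return reduce(mul, layer_probs, 1.0)

# Illustrative figures only: four defences, each failing on 1 in 100 occasions.
layers = {
    "signalling system": 0.01,
    "documented process": 0.01,
    "supervision": 0.01,
    "individual": 0.01,
}

print(aligned_failure_probability(layers.values()))  # ~1e-08

# If one layer is routinely sidestepped ('the way we've always done it'),
# its hole is effectively permanent and the residual risk jumps a hundredfold.
layers["documented process"] = 1.0
print(aligned_failure_probability(layers.values()))  # ~1e-06
```

The point is not the particular numbers but the structure: the model treats an accident as the rare coincidence of several weaknesses, which is why weakening or removing a single layer can change the odds far more than intuition suggests.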

Whilst most people want to do their best for their company, they have always found ways to navigate around standard procedures, processes and best practice.

In some instances they may have been told to sidestep a process by their manager or a more experienced colleague because it’s ‘quicker’, or ‘the way we’ve always done it’. Such process tweaks can be effective, but when they aren’t embedded properly they can, and do, expose companies to risk on a mammoth scale.

Cognisco calls this exposing yourself to People Risk.

How can companies avoid it?

One of the key challenges is that when an incident like a train crash occurs, companies resolve to do things differently. They pledge to review their processes and invest in more training. However, the problem is that if the process or the training wasn’t the cause of the accident, such initiatives are bound to be fruitless.

Why would a new process work if the old one wasn’t followed? How would more training benefit people who didn’t understand or apply previous training? How will things be different?

A new approach is surely required.

Wouldn’t it be better if organisations had insight into what their people actually understood about their role – from the training, support material and embedded processes provided – and whether they are confident enough to apply that knowledge effectively and consistently?

What if such an approach had been applied across the Bavarian rail organisation? Would things have turned out differently? Perhaps with greater insight into the Area Controller’s specific development needs, his company could have given him tailored support and interventions to improve his performance – actions which may have pre-empted the catastrophe.

Every organisation that relies heavily on its people would benefit from insight into gaps in understanding across its workforce. Having this kind of intelligence enables a company to provide tailored learning, training and support for every individual. This would not only avoid ‘sheep dipping’ everyone with the same ‘solutions’ but also enable workforce optimisation.

We are working with many global companies to provide them with real-time data about their people’s capability and their confidence in delivering against required competencies – something which is helping to transform their businesses.

We have seen that if companies have people-centred data and a global view of the understanding, capability and confidence of relevant teams and individuals, they can more easily identify potential areas of risk and take action to mitigate the risks before problems or accidents occur.

They can also make strategic people-based decisions with confidence, ensure their talent is used in the best way, and use the data to evidence regulatory compliance and reduce the cost of audit and litigation.
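As a rough illustration of how capability and confidence data might be combined – the field names, thresholds and banding rule below are assumptions made for this sketch, not Cognisco’s actual methodology – a simple classification can flag the riskiest combination: people whose confidence is high but whose assessed understanding is low.

```python
from dataclasses import dataclass

@dataclass
class AssessmentResult:
    name: str
    competency: str
    knowledge: float    # assessed understanding, 0.0 - 1.0
    confidence: float   # self-reported confidence, 0.0 - 1.0

def risk_band(result, knowledge_cutoff=0.7, confidence_cutoff=0.7):
    """Place one assessment into a simple risk band.

    The highest-risk band is 'confidently wrong': low assessed
    understanding paired with high confidence, because these people
    act decisively on a mistaken picture of the process.
    """
    knows = result.knowledge >= knowledge_cutoff
    sure = result.confidence >= confidence_cutoff
    if knows and sure:
        return "low risk: competent and confident"
    if knows:
        return "medium risk: competent but hesitant - build confidence"
    if sure:
        return "high risk: confidently wrong - intervene first"
    return "medium risk: aware of the gap - targeted training"

# Hypothetical assessment records for a single competency.
results = [
    AssessmentResult("A", "track access rules", knowledge=0.95, confidence=0.90),
    AssessmentResult("B", "track access rules", knowledge=0.40, confidence=0.92),
    AssessmentResult("C", "track access rules", knowledge=0.45, confidence=0.35),
]

for r in results:
    print(f"{r.name}: {risk_band(r)}")
```

Aggregated across teams and competencies, bands like these are the kind of people-centred intelligence described above: they point interventions at the specific individuals and topics where misplaced confidence is most likely to turn into an incident.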

Investigations are ongoing in Germany and, according to reports, prosecutors do not believe the man acted deliberately; however, he needed to (and did) push the button himself in order to allow the trains through.

Whilst we may never know exactly why the accident happened, we do know that many incidents caused by human error can be avoided, and the risk of them recurring significantly reduced.

(1) Reason, J., ‘Human Error: Models and Management’ – http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1117770/


Mary Clarke is the CEO of Cognisco.

 

 


2 Comments
Nigel Dupree
8 years ago

No different from any other risk assessment in terms of probability. Simples: the greater the deficit in reasonable “basic given operational conditions”, the greater the effect on the human factors surrounding work-related stress, associated with over-exposure and fatigue, predictably increasing error rates and, potentially, the risk of mishap or, worse, serious consequences of omission.

Ray Rapp
8 years ago

Human error is a complex concept which is inextricably linked with design, manufacture, training, supervision and so on. If all these factors were taken into account, it could be argued that human error is 99.99 per cent the cause of all accidents. Of course, it could be argued that human error is a symptom – not a causal factor – or perhaps the last line of defence in poorly designed systems. The problem is that many organisations do not want to invest in safety – harsh but true. Sure, they don’t want accidents and incidents. Nevertheless they are sub-consciously hoping a serious incident…