CPD – Behavioural Health and Safety
To explain why, in relation to health and safety, people do what they do Dr Tim Marsh offers a model we can’t refuse.
At social events, when people find out you’re a psychologist the first thing they ask (after “what am I thinking, then?”) is “why do you do what you do?” The more of an effort you make to answer that in a user-friendly way, the more likely they are to say “that’s just common sense, that!”
To be awkward back (when you know the person asking isn’t doing so out of interest), you could try something like: “Basically, it’s a dynamic interplay between antecedents and consequences, genetics, dispositions, external stimuli like totems, perceptions, beliefs, values and societal and local norms — all of which are moderated to a greater or lesser extent by the freedom afforded by environmental factors and constraints…”
Take a slower walk through that supercilious response, however, and you realise that a common-sense holistic approach tailored to the world of health and safety is possible. Three recent challenges took me back to first principles: I helped moderate a debate on the Deepwater Horizon disaster in front of a large and knowledgeable audience, was involved in a forum on urban cycling safety, and was then presented with perhaps the most difficult behavioural challenge I will ever encounter.
In simple terms, this article proposes that for a robust holistic model we just need to blend Ajzen’s model of ‘planned behaviour’1 with Reason’s ‘Just Culture’ model2 and good old-fashioned ‘ABC analysis’. (If ‘Just Culture’ and ‘ABC analysis’ are unknown to you, see my 2010 article3 in these pages.)
Such a model is, I believe, applicable to anything that involves people and, in particular, the two infamous ‘elephants in the room’ in the UK: health and road safety. (I’ll try to illustrate this assertion with three case studies representing health, safety and ‘health and safety’: exposure to drugs in a laboratory, cycle safety, and the Deepwater Horizon explosion, respectively).
The holistic model of individual action
My adaptation of Ajzen’s academic model of planned behaviour (see Figure 1) suggests that in order to understand why a person has done what they have done we need to know three things. As with all dynamic models these issues interlink and overlap, but a systematic consideration of each will help ensure nothing vital is missed.
The first thing we need to know about, of course, is the person — what is their attitude in general, and to the situation at hand specifically? For example:
- Are they health conscious?
- Are they fatalistic?
- Are they trained?
- Are they experienced?
- Are they easily tempted?
- Are they well-led and well-coached?
We can impact on this at selection, at induction and training, and with awareness-raising.
The second factor is control, or perceived control of the environment.
- Starting with: are there any weaknesses in the list above?
- Are there any physical barriers to them acting appropriately?
- Do they think they can act appropriately?
- Can we ‘nudge’ them to act more appropriately with a clever tweak to the environment?
Here, we need to consider the environment from a Just Culture/human factors/ergonomics perspective, and we can impact on the situation with ergonomic and organisational changes.
The third and final factor considers norms:
- What are the relevant norms in society generally?
- What are the norms among this person’s peers?
We know that the best definition of culture is ‘what happens around here on a typical day’ and that it’s incredibly powerful.
Indeed, this ‘norm’ element explicitly links the person and the environment and is what makes Ajzen’s model so strong, as we all know that we can only change norms via systemically addressing the individual and/or the organisation. Some relatively advanced models of human factors lead you to ‘it’s an unintentional error — get an ergonomist’, or ‘it’s a cultural issue, we need to look at leadership’. This model nudges us to think that for a comprehensive solution we need to look at leadership and get an ergonomist!
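The three-factor review described above can be expressed as a simple checklist structure. Below is a minimal sketch in Python; the class and field names are my own shorthand for illustration, not anything from Ajzen’s model itself:

```python
from dataclasses import dataclass, field

@dataclass
class BehaviourAssessment:
    # One dict of evidence per factor in the three-factor model;
    # names are illustrative shorthand, not Ajzen's terminology.
    person: dict = field(default_factory=dict)   # attitudes, training, experience
    control: dict = field(default_factory=dict)  # barriers, perceived ability, nudges
    norms: dict = field(default_factory=dict)    # societal and peer norms

    def gaps(self):
        """List the factors with no evidence recorded yet, prompting a
        systematic review so that nothing vital is missed."""
        factors = {"person": self.person, "control": self.control, "norms": self.norms}
        return [name for name, evidence in factors.items() if not evidence]

# An assessment where only the individual has been considered so far
assessment = BehaviourAssessment(person={"trained": True, "health_conscious": True})
print(assessment.gaps())  # -> ['control', 'norms']
```

The point of the sketch is simply that each factor gets an explicit slot, so an investigation that has skipped the environment or the norms is immediately visible.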
An example of the interaction
In the film ‘The Godfather’ the decision of the character Michael Corleone to shoot his father’s rival and a corrupt policeman illustrates how these factors overlap and interplay — and hopefully gives the editor a cracking photo-illustration opportunity! (Cheers, Tim — Ed).
Though he is a trained and skilled killer as a result of his WW2 army experiences, Michael is influenced by society’s norms in general and is considered the ‘war-hero civilian’. His brother Sonny is very much the prince in waiting, as far as the ‘family business’ is concerned. However, Michael finds himself in a situation where he realises that drastic action needs to be taken to protect his father, that he personally is capable of taking that action and (crucially) that only he has the opportunity to take it.
He makes the shift to operating from the local norms of the Corleone family — the corrupt policeman and his father’s rival come to a sticky end — and it all makes for a hugely entertaining and Oscar-winning film. Note that Sonny lags behind Michael in his thinking initially — still seeing Michael as a civilian, so he laughs dismissively at the suggestion — but he then steps back, looks at the bigger picture and sees it makes sense. (The use of humour in this way will make an important reappearance in a short time when we discuss Deepwater Horizon, but that’s as far as I can stretch this metaphor!)
To leave Hollywood and return to the three practical studies introduced above:
The behavioural health problem to end them all
I recently spoke to a company that makes highly effective cancer-fighting chemicals. These drugs are so powerful that the active element of the drug in question is diluted to concentration levels close to those claimed for homeopathy. The problem is that, as well as being invisible to the naked eye, contamination cannot currently be detected by swabbing and screening — we simply haven’t got the technology. (The company scientists joked, in classic Monty Python style: “Radiation? Pah! How easy is radiation? We can make it click loudly and set off alarms at levels that aren’t even all that dangerous. I wouldn’t even get out of bed for radiation.”)
Worse, the lag time between exposure and illness is years (usually decades) and, of course, there’s a huge financial imperative. This isn’t the Atomic Weapons Establishment, run by a government department, which will still be here, accountable and able to be sued, 30 years from now. The suits running this company now (who aren’t scientists, let alone hygienists) will be long retired to the golf courses of Spain and Florida (or in the Lords) when the crap hits the fan.
Being faced with this ‘perfect storm’ of a behavioural problem inspired me to go back to first principles. How exactly do we get the technicians in these labs to follow protocols that will keep them safe?
Starting with control, we have to ask from a ‘Just Culture’ perspective whether or not the technicians have the tools to implement the training they have been given and understand the risks. (Actually, yes they do.) Next, we can ask if it is actually possible to follow the protocols given the time constraints — i.e. a ‘situational’ violation (again, yes it is). Finally, is there unspoken pressure to finish up and turn to something the organisation values and rewards more strongly than hygiene — what Reason would call an “optimising” violation. (Here it gets interesting, and we’ll return to this.)
What there is most obviously, it seems, is an ‘individual violation’ — a classic ABC situation, where there are no short-term consequences at all for a little corner-cutting, or lack of thoroughness. (They can’t even set off alarms and ‘beep’ as they are screened. The work surfaces look spotless even if they aren’t, and no one looks or feels the slightest bit ill — not even after working there for years.)
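The ABC logic here — consequences that are soon, certain and positive drive behaviour hardest, and corner-cutting in this lab has no immediate downside at all — can be illustrated with a toy scoring sketch. The scoring scheme and the entries are invented purely for illustration; they are not part of any formal ABC method:

```python
# A toy sketch of ABC (antecedents-behaviour-consequences) analysis:
# consequences that are soon, certain and positive drive behaviour
# most strongly. Weights and entries are illustrative only.
def weight(soon, certain, positive):
    # +1 for each motivating attribute, sign flipped for negative outcomes
    score = (1 if soon else 0) + (1 if certain else 0) + 1
    return score if positive else -score

# Consequences of cutting corners on lab decontamination
consequences = {
    "finish the batch sooner":        weight(soon=True,  certain=True,  positive=True),
    "peer approval for keeping pace": weight(soon=True,  certain=False, positive=True),
    "illness decades later":          weight(soon=False, certain=False, positive=False),
}
net = sum(consequences.values())
print(net)  # -> 4: the short-term pay-offs comfortably outweigh the distant risk
```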
My analysis of the problem is that, although the workers are ‘fully’ trained and therefore worried about their health, they aren’t as out-and-out paranoid and scared as they should be! Thus, a two-pronged attack was suggested. The first was to deliver some very intense and visceral training, illustrated with effects of contamination they could never forget. In other words, to completely maximise the “why” element of training, as well as the “what” and “how”, as I have previously described in these pages.5
The second element was to follow this up with a series of intense observation and feedback sessions on the decontamination protocols, using trained coaches to fully embed the behaviours requested. In other words, treating the health protocols with the same intensity that would be applied to a key set of behaviours that could cause a catastrophic crash in the share price next week.
This intensity of follow-up needs to be maintained long enough for the protocols to genuinely become the norm — ‘part of the way we do things around here’ — and, therefore, partly self-sustaining. However, they must still be followed up often enough for this standard not to degrade. This investment is vital and this is where management’s long-term commitment comes in. The point I want to make is that since this is, under the current circumstances, exactly what is required to tackle this health issue effectively then anything less than this is, indeed, the “optimising” violation I alluded to above.
Urban cycling: a multi-faceted challenge
The second practical example I’d like to address is the multi-faceted attempt to get more people cycling to work for the various health and congestion-easing benefits. As well as ‘Boris bikes’ and promotional campaigns there are ‘Bikeability’ lessons for older riders to improve their skills and confidence. There are technological advances, such as warning sensors on lorries and side-protection engineering. There are awareness-raising campaigns in which cyclists sit in lorry cabs and are sneaked up on via blind spots and frightened by a slap on the window. Amusingly, there are even squads of terrified lorry drivers being taken out into London traffic on bikes.
Further up the safety hierarchy there are experiments with bike lanes and bike-friendly roundabout designs, as used in countries like Holland and Denmark. All of which are aimed at:
- making individuals feel more positive about cycling;
- reassuring them that they can achieve the health benefits safely; and
- turning urban cycling into something that is very much the norm — a part of UK life in a way that it is in Denmark and Holland.
This is a holistic approach, albeit one that has evolved piecemeal rather than by design. However, with different factions still arguing, a genuinely joined-up approach is needed. For example, it is suggested that night deliveries will ease congestion, but the shift systems required mean that, inevitably, many of the drivers will have their physiology screaming ‘I should be asleep!’ at them. This isn’t ideal when you’re behind the wheel of a 55-ton lorry.
Deepwater Horizon — a catastrophic process-safety failure
Most people know what happened on the Macondo well: the ‘mud’ seal failed and there was all sorts of evidence to let the team there know it had failed but they missed it — even dismissed it. Most infamously, perhaps, they dismissed pressure readings that made it clear that something was wrong as the ‘bladder’ effect (there is no ‘bladder’ effect).
BP wasn’t alone here, of course, and it should be said that the company was rather ‘scapegoated’ (President Obama was the first person in decades to refer to it as British Petroleum), but its employees certainly made some basic mistakes, too. How could they do that so soon after Texas City, with the findings of the Baker Report still ringing in their ears? How could they have been part of a high-level safety visit just the day before and not spot that something was wrong? Well, very easily, as it happens.
I’ll just pick on some of the pertinent points to illustrate. Starting with the individuals, we can see that although they were described as fully trained and qualified, what they actually were was highly experienced and with impressive-sounding job titles that might better have been described as ‘foreman’. None had a degree in engineering, or any training in lateral or critical thinking — or in the ‘non-technical’ skills of dynamic problem-solving in teams. On Macondo, they could — even should — have consulted with the engineers on the ‘beach’ but, in practice, the norm of self-sufficiency meant that they rarely did.
Interestingly, it was actually the BP man who was most concerned and who challenged the ‘bladder-effect’ explanation of the warning readings. His peers found this amusing and ‘robust humour’ was involved: basically, they took the rise out of him for his ‘timidity’. Teasing is, of course, a key lever of ‘groupthink’, and he fell in line with the local norms; a classic group ‘risky shift’ occurred (this being the mechanism by which a group, seeing reassurance in numbers, makes a riskier decision than an individual would). Addressing this, BP now has a new protocol under which the team leader consults with the team, then retires to make an executive decision for which they alone are responsible.
It’s also worth considering the role that leading indicators played in the explosion. Following Texas City, BP rolled out some excellent process safety-specific lead measures. Unfortunately, these were ideal for a production platform but required tailoring for a drilling platform. The financial imperative remained strong, and we know that ‘what gets measured gets done’ and directs attention. In consequence, although personal safety was excellent, the lack of specific lead measures meant that the risk of an explosion on a drilling platform was, with the benefit of hindsight, a blind spot.
In a previous article on Reason’s overstretched elastic-band model6 I made the case that the best organisations spot quickly when they are drifting towards vulnerability and have mechanisms to snap back quickly. For example, a really obvious issue to focus on regarding a drilling explosion would be ‘kicks’ (increases in pressure in the well that precede a blow-out). Ideally, data would have been collected on how quickly these were spotted and how effective the response was.
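The kind of lead measure suggested here might be summarised along the following lines. The figures are entirely invented, purely to show how kick detection-time data could be turned into a trackable indicator:

```python
from statistics import mean

# Hypothetical 'kick' events on a drilling platform: (minutes until the
# kick was spotted, whether the response was judged effective).
# All figures are invented to illustrate the lead measure, nothing more.
kicks = [(4.0, True), (12.5, True), (30.0, False), (6.0, True)]

detection_times = [minutes for minutes, _ in kicks]
mean_detection = mean(detection_times)                         # 13.125 minutes
effective_rate = sum(1 for _, ok in kicks if ok) / len(kicks)  # 0.75

print(f"Mean time to spot a kick: {mean_detection:.1f} min")
print(f"Effective responses: {effective_rate:.0%}")
```

Tracked over time, a rising mean detection time or a falling effectiveness rate is exactly the kind of ‘drift towards vulnerability’ that an organisation would want to snap back from quickly.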
In addition, during the site visit, the experienced visitors had several opportunities to initiate conversations that may well have identified the problem, but they kept to personal safety issues. Why? Because challenging such competency was considered rather insulting; it simply wasn’t what they did during these visits.
Therefore, a number of warning signs were missed and the famous ‘gorilla on the basketball court’ was able to walk straight past. The workers weren’t looking for gorillas with a healthy paranoia, as described by Andrew Hopkins;7 they’d already decided everything was alright and could only see things that confirmed that mindset. They were looking forward to getting off the rig and some down time/the next challenge.
In short, on Macondo, there were unintentional errors caused largely by environmental cues, mixed with individual issues and local norms. As they always do.
In previous papers, I’ve suggested that predicting organisational accident rates is easy if you know three things: how strong a company’s systems are; how strong the transformational safety leadership is (coaching, praising, leading by example and involving); and how committed the organisation is to mindfully learning about its weaknesses from a ‘Just Culture’ perspective. This article suggests that predicting individual behaviour from a similar triptych makes for a robust human-factors model that can be applied to any situation.
We simply need to understand the individual’s knowledge, mindset and disposition. We need to understand which environmental issues can impact on individuals and which factors might actually prevent an individual from acting appropriately. Finally, we need to understand the cultural link between the two — the societal and local norms.
People are very unpredictable — but often in a very predictable way.
1 Ajzen, I (1991): ‘The theory of planned behavior’, in Organizational Behavior and Human Decision Processes 50 (2): 179-211
2 Reason, J (1997): Managing the Risks of Organisational Accidents, Ashgate
3 Marsh, T (2010): ‘It’s a kind of magic’, in SHP Sept 2010, Vol.28 No.9, pp40-42
4 Dekker, S (2008): The Field Guide to Understanding Human Error, Ashgate
5 Marsh, T (2012): ‘Go forth and multiply’, in SHP Aug 2012, Vol.30 No.8, pp45-48
6 Marsh, T (2010): ‘Stretch to the limit’, in SHP Feb 2010, Vol.28 No.2, pp39-42
7 Hopkins, A (2012): Disastrous decisions, CCH Australia
Dr Tim Marsh is managing director of Ryder-Marsh Safety Ltd.
Continuing professional development is the process by which OSH practitioners maintain, develop and improve their skills and knowledge. IOSH CPD is very flexible in its approach to the ways in which CPD can be accrued, and one way is by reflecting on what you have learnt from the information you receive in your professional magazine. By answering the questions below, practitioners can award themselves credits. One, two or three credits can be awarded, depending on what has been learnt — exactly how many you award yourself is up to you, once you have reflected and taken part in the quiz.
There are ten questions in all, and the answers can be found at the end of this article. To learn more about CPD and the IOSH approach, visit www.iosh.co.uk/membership/about_membership/about_cpd.aspx
1 Which models does the author suggest be combined for a holistic approach to behaviour? (Tick all that apply)
a) Ajzen’s model of planned behaviour
b) The Hale and Hale model
c) ABC analysis
d) Reason’s ‘Just Culture’
2 In order to understand why a person has done what they have done, what three things do we need to know? (Tick all that apply)
a) The person
b) Perceived control of the environment
c) The salary level of the person
d) The norms in existence
3 What would we NOT need to know about the person?
a) Whether they have been trained
b) Whether they are health conscious
c) Their career aspirations
d) Whether they are easily tempted
4 Whether the person thinks that they can act appropriately would be classed as what type of factor?
a) Their degree of job satisfaction
b) Perceived control of the environment
c) Societal norms
d) Shift patterns
5 In addition to norms among the person’s peers, we would also need to take account of norms in society generally:
a) True
b) False
6 Not following protocols because of time restraints would be classed as what?
a) An optimising violation
b) A rule-based error
c) A situational violation
d) An individual violation
7 What would NOT be part of an effective strategy for encouraging urban cycling?
a) Making individuals feel more positive about cycling
b) Putting in strict controls on cyclists
c) Reassuring individuals that they can enjoy the benefits safely
d) Turning urban cycling into a norm
8 Why might a group make riskier decisions than an individual?
a) They see themselves as more powerful
b) They see other groups making decisions
c) They see reassurance in numbers
d) They have no concern for safety
9 Why might visiting safety inspectors avoid questioning what workers are doing?
a) They only have to be concerned about documentation
b) It is not their concern
c) They do not wish to question people’s competence
d) Nobody expects them to ask
10 Organisations can predict their accident rates from the perspective of: (Tick all that apply)
a) How strong the organisation’s systems are
b) How strong the safety leadership is
c) The financial costs of accidents
d) The number of people employed
1. a, c, d
2. a, b, d
3. c
4. b
5. True
6. c
7. b
8. c
9. c
10. a, b