People, risk and the blame game

Written by
Changeboard Team

20 Nov 2015


When things go wrong, more often than not, there’s a desire to blame individuals. Culpability is rarely ascribed on a collective, organisational or institutional level. But what if individuals 'fit in' with what they believe will be accepted – whether by their group, by their bosses or by their company?

Case in point: Mid-Staffordshire. The Francis Report made clear that the basic cause of what happened – or what was allowed to happen – within Mid-Staffordshire NHS Trust was an across-the-board, wholesale failure of culture. The practices that developed were totally inconsistent with the basic obligations of any NHS employee and the professional duties of any clinician. The erosion of acceptable conduct was essentially all-encompassing. In Francis’s words, staff 'did the system’s business'.

This is a classic case of what has come to be known as 'people risk', the central tenet of which is that such incidents can almost always be traced back to internal hierarchies that permit major errors of judgment to go undetected, ignored or suppressed until calamity strikes. Salutary lessons abound in various fields.

What happens when risk is ignored?

Consider, for example, the collapse of the US sub-prime housing market in the mid-2000s. It was the result of an astonishing absence of responsible risk controls. With staff pursuing their own destabilising agendas, mismanagement and myopia drove institutional contempt for risk to its apogee.

Consider, too, the appalling repercussions of the decision to launch the Challenger space shuttle on the freezing-cold morning of 28 January 1986. It was allowed to happen because NASA repeatedly reinterpreted evidence of an issue with the O-rings in the shuttle’s solid rocket boosters as within the bounds of tolerable risk. Selective blinkeredness culminated in the very public deaths of seven astronauts.

Richard Feynman, the maverick physicist and Nobel Prize winner, famously exposed NASA’s institutional shortsightedness by dropping a chunk of O-ring into a glass of iced water during a televised hearing into the disaster. Commenting on the disconnect he found between the agency’s managers and engineers, he later wrote: “It’s a question of whether, when you do tell somebody about some problem, they’re delighted to hear about it and say ‘tell me more’ or they say ‘well, see what you can do about it’ – which is a completely different atmosphere. If you try once or twice to communicate and get pushed back, pretty soon you decide ‘to hell with it’.”

Another academic who investigated the tragedy, American sociologist Diane Vaughan, coined the phrase “the normalisation of deviance” to describe an organisational situation in which bad habits become standard practice. Again, it’s dispiritingly easy to apply this term here, there and everywhere.

The dangers of a perfect storm

One reason why deviance flourishes in so many settings is that three key strands of business bureaucracy exist in a state of constant antagonism. Upper management either blithely assumes safe policies are being implemented or is content to collude with risky practices that deliver short-term results. Risk assessors are left to cope with the contradictory demands of executives who maintain that greater performance can go hand-in-hand with greater caution. Frontline staff don’t understand – or don’t care to understand – the implications of policies that stop them maximising their earning potential through taking risks.

This can lead to a perfect storm. The organisational fracture between top and bottom is exploited as a matter of course. People act subjectively out of malice, avarice or ignorance. The most cavalier risk-takers are lauded – at least until their actions finally backfire – while whistle-blowers are more likely to be sacked than supported.

The bottom line is that a demotivated and opportunistic workforce will at best look away and at worst abuse the flaws in the system when confronted by a grave risk incident. By contrast, an appropriate work climate and a well-motivated and ethical workforce may help to detect such an incident.

What do we do about it?

One possible answer is to nurture a culture in which responsible behaviour, not its diametric opposite, is rewarded. This requires incentives to be linked to compliance rather than to subversion. Francis hinted at something similar, suggesting a key objective in seeking to prevent a repeat of Mid-Staffordshire should be the fostering of circumstances in which it’s easier to express genuine fears about care quality than it is to stifle them.

The broader lesson is that the raising of concerns should be celebrated rather than censured. As we all know, this remains a goal that’s frequently expressed but seemingly difficult to achieve. It’s an idea we simply have to make work, otherwise the cosy convenience of finding someone to carry the can will forever take precedence over the painful acknowledgment of institutional failings – too often with devastating consequences.