In a previous post I wrote about the failings common to many organisational decision-making processes.
Prominent among these shortcomings, I suggested, is a tendency to rush to judgment – to act in haste and repent at leisure.
As I remarked, this unhealthy fondness for quick-fix solutions is routinely followed by a reluctance to alter course. It is seldom in our nature, even when we belatedly recognise the error of our ways, to concede our blunder and change tack. By way of a classic illustration from history, consider Napoleon’s invasion of Russia. There may be something to be said for sticking to your guns – but not in the face of overwhelmingly superior firepower.
Napoleon doubtless felt himself accountable to no-one, but for lesser mortals – particularly those within modern-day organisations – the spectre of blame culture is a habitual contributor to such intransigence. When the imprudence of the original decision at last becomes too obvious to ignore, prompting the long-overdue contemplation of numerous “what ifs”, the aim is often not to learn from mistakes per se but rather to heap misery on those who “called it wrong”.
In short, the buck must stop somewhere. Somebody has to carry the can. It falls to individuals to take the rap. This is a highly corrosive philosophy – one that arguably creates far more bad decisions than it claims to address.
Victims of the system?
Britain’s worst-ever rail crash occurred at Quintinshill, near Gretna Green, on May 22 1915. Three trains, one carrying hundreds of troops destined for the battlefields of the First World War, collided outside a signal box. More than 200 people were killed.
A Board of Trade inquiry into the disaster opened just days later. It was a notably swift affair. It concluded that the two signalmen, George Meakin and James Tinsley, were responsible; they were later charged with culpable homicide and jailed at the end of a similarly cursory trial at which their barrister called no witnesses in their defence.
This was blame culture at its most explicit. Scholars have since highlighted a marked disinclination to acknowledge significant organisational flaws, including the use of outdated and inherently unsafe carriages and a blinkered determination to maintain peacetime levels of profitability during wartime. It is hard to imagine the matter could be dealt with so summarily – and, indeed, with such a narrow focus – today.
Yet one of my colleagues, Paul Kirkham, has already written on these pages about another rail crash, which happened in Amagasaki, Japan, on April 25 2005. In that case more than a hundred people died when a train that was just 80 seconds late left the track and smashed into a block of flats.
An official investigation found the driver had been trying to make up time when he lost control. But why was a delay of barely a minute so terrifying to him? What sort of prospect was so frightful that he was prepared to risk his own life and the lives of his passengers?
Had he survived, he would have been severely fined and ordered to attend an “education” course whose methods were rooted more in humiliation than in re-training. It is perhaps also worth pointing out that even when he realised he was going too fast he applied only the service brake, probably because use of the emergency brake would have merited additional punishment.
Some harsh truths
These tragedies, which occurred more than nine decades apart, show us two things. The first is that we have not come as far as we would like to believe in addressing the problem of blame culture. The second is that broader organisational failings, particularly those that occasion unnecessary pressure, are central to many of the bad decisions for which individuals are subsequently censured.
The same painful lesson is writ large in various high-profile scandals. The Francis Report into the events surrounding Mid-Staffordshire NHS Foundation Trust in the late 2000s famously observed that staff, because they found it easier to stifle their concerns about patient care rather than express them, “did the system’s business”. The sub-prime mortgage crisis resulted from wholesale mismanagement and a collective contempt for risk. Enron championed a code of “rank and yank” and was eventually exposed as utterly dysfunctional and institutionally corrupt. In each case there was an overarching culture that both lent itself to sub-optimal decision-making and did little or nothing to deal with the repercussions.
The bottom line is that most of us are wrong more frequently than we are right. After all, to err is human. Any organisation that cannot bring itself to accept as much invites the erosion of its employees’ willingness to admit to mistakes when they might still be corrected; and in doing so it learns nothing and instead merely perpetuates a damaging cycle of poor choices, negative consequences and debilitating condemnation.
Such an approach was insulting enough in 1915. Now, more than a century later, it is not just hopelessly outmoded: it is thoroughly inexcusable.