We should never forget that we are all human

Written by
Paul Kirkham

20 Sep 2016

One of the reasons air travel is the safest form of mass transportation is that the aviation industry is excellent at learning from mistakes. A generation ago fatalities occurred around once in every 140 million miles flown; since then there’s been a tenfold improvement. Every plane crash makes the next flight safer.

Of course, plane crashes tend to be high-profile and costly – both in terms of money and in terms of lives lost – and consumer pressure alone arguably drives action. But it really wasn’t until the last quarter of the 20th century that the industry, galvanised by a string of disasters that showed most tragedies are caused not by technical faults but by people, collectively grasped one of the most valuable lessons of all: that to err is human.

Take Eastern Air Lines Flight 401, which crashed on December 29, 1972, after every member of the flight crew became preoccupied with a landing-gear indicator light that hadn’t illuminated. Nobody noticed that the autopilot facility had inadvertently been switched off and that the plane was gradually descending. In fact, there was nothing wrong with the landing gear. The indicator light’s bulb had simply burnt out.

More than a hundred people died when the plane plummeted into the Florida Everglades. The subsequent introduction of crew resource management (CRM), a sweeping new set of principles and procedures, represented a pivotal chapter in aviation history.

Thought process

We might sum up the industry’s response as follows:

  • Mistake acknowledged
  • Lesson learned
  • Problem solved

Compare this philosophy with the culture of so many other domains, whose reaction to failure is routinely characterised by denial, buck-passing and a suspicious keenness to “move on”. We might sum up these responses as follows:

Deny

  • “That’s not what happened.”
  • “I would dispute those figures.”
  • “I can’t comment on individual cases.”


Pass the buck

  • “We’re going to get to the bottom of this.”
  • “Lessons will be learned.”
  • “I can’t comment while the inquiry is ongoing.”

Move on

  • “It’s a very different situation these days.”
  • “It’s time to draw a line.”
  • “There’s no point in dwelling on the past. We need to look forward.”


In these cases, even if culpability is finally established, the scapegoats identified at the end of a drawn-out process are sometimes no longer even there. They might have left to “seek new challenges”. They might even have taken early retirement to “spend more time with their family”.

An alternative approach is to impose swift and terrible punishment on those who blunder. Key Performance Indicators are introduced, and anyone who doesn’t measure up doesn’t progress. This is sometimes known as Prozac leadership: fear of failure is all-pervasive but disguised by unshakable optimism.

The cost can be high, as was sensationally demonstrated by the Enron scandal. Enron exercised zero tolerance of failure with its “rank-and-yank” management policy, but its ruthless monitoring of KPIs encouraged dysfunctional behaviour and institutional corruption. Shareholders lost $74 billion when the corporation went bust, while the CEO got 24 years in prison. This is what can happen when all sense of proportion is lost in enforcing targets.

By way of further illustration, consider the events surrounding a rail crash in Amagasaki, Japan, on April 25, 2005. Lateness would have brought the driver severe penalties – not just a financial sanction but attendance at an “education” course – and even when he realised he was going too fast, he applied the service brake, probably because use of the emergency brake would have invited further retribution.

The train was just 80 seconds behind schedule when it left the track. Inevitably, the official investigation concluded the driver had been trying to make up lost time. As with Eastern Air Lines Flight 401, more than a hundred people were killed.

The trouble with any organisation in which to err is not human but literally inexcusable is that mistakes are never acknowledged. And mistakes that aren’t acknowledged can’t be learned from. The only lesson that ever really emerges is that one should cover one’s back at all times. This ignores the fundamental truth that learning from mistakes is crucial to how we improve.

Being wrong more often than we’re right may well be the antithesis of the career-development metrics and target-driven culture that characterise much of modern-day working life, but it also happens to be the very essence of creative thinking. We would do well to remember this in an age when the balance between safety and pressure, between support and punishment, is frequently tilted far too much in the latter’s favour.