About Me

This blog carries a series of posts and articles, mostly written by Anthony Fitzsimmons under the aegis of Reputability LLP, a business that is no longer trading as such. Anthony is a thought leader in reputational risk and its root causes: behavioural, organisational and leadership risk. His book ‘Rethinking Reputational Risk’ was widely acclaimed. Led by Anthony, Reputability helped business leaders to find, understand and deal with these widespread but hidden risks, which regularly cause reputational disasters. You can contact Anthony via the contact form.

Friday, 29 September 2017

Learning from mistakes: the key to flying high



“He hit me first,” whines the indignant two-year-old. We learn the ‘blame game’ young. With luck it develops into asking “why (did he hit me/take my sweets…etc.)?” 

Inadequate investigations

“Why?” is a powerful question. Inexpertly used, it leads to quick but superficial attribution of causes: “Why did the rogue trader emerge?” “Because he was bad.” In times past, air accident investigations often concluded that the tragedy was caused by ‘pilot error’. But as Stanley Roscoe, a pioneering aviation psychologist of the 1980s, pithily put it, blaming an accident on ‘pilot error’ was “the substitution of one mystery for another”.

At the time, air accidents remained uncomfortably frequent, with deaths running at around a thousand per year. Roscoe’s insight was a key to transforming aviation from a somewhat hazardous activity into one so safe that the prospect of an aircraft crashing onto London as it approaches Heathrow has barely featured in the debate over a new runway for London’s airports. Terrorism apart, air accidents on western-built aircraft globally killed about 300 people per year in the decade to 2015, by which time the number of flights had more than doubled. For comparison, over 1,800 were killed on UK roads in 2013 alone (over 34,000 on US roads).

Aviators learnt to learn better

The transformation was no accident. The airline industry foresaw that growth in flying might lead to a monthly air disaster featured on every front page if it could not improve safety. As aviation investigators and academics dug deeper into the causes of accidents, asking “why?”, significant themes emerged.

Digging deeper uncovers system failures

One concerned communication failures. High workloads played a part, but some failures were due to hierarchy. A co-pilot needed to tell his commander (in those days pilots were always men) that something was going wrong, but the difference in status led him to mince his words in a way that masked the message; or the message was clear but his commander was unable to absorb information that did not fit his expectations. Sometimes the co-pilot said nothing at all, because a challenge was socially unthinkable even when the alternative was imminent death.

The Kegworth crash


The problem grew worse as the gap in status increased, with an even higher barrier between the flight deck crew and the cabin crew, even though the latter might have vital information. When the commander of the aircraft that crashed at Kegworth in 1989 announced to all on board that there was a problem with the right engine, which he was shutting down, many in the cabin could see that it was the left, not the right, engine that was on fire. Whilst some cabin crew were too preoccupied with their emergency duties to notice the announcement, no attempt was made to tell the flight crew that it was the left engine that seemed to be on fire. The aircraft crashed just short of the runway: the functioning right engine had been shut down, and the left engine’s fire was made worse when extra fuel was pumped into it. Forty-seven people died and, of the 79 survivors, 74 suffered serious injuries.

The pilot who was sucked out of the cockpit

Another theme was system failures. When accidents are investigated there is, of course, an immediate cause. Soon after a BAC 1-11 aircraft took off from Birmingham Airport in 1990, there was a loud bang as a newly installed cockpit windscreen blew out at 17,000 feet. The commander, who had loosened his safety harness, was sucked out of the aircraft and left hanging on by his knees. He was saved by cabin crew holding his legs as the co-pilot regained control of the aircraft and landed it safely.

The immediate cause was that the windscreen had been installed using bolts that were either too small in diameter or too short. The next, deeper level of causes included a fundamental design error in the windscreen and a mechanic deprived of sleep. But even this was not enough for the investigators, who identified fundamental system failings, including that “the number of errors perpetrated on the night of this job came about because procedures were abused, 'short-cuts' employed and mandatory instructions ignored. Even when doubt existed about the correct size of bolt to use, the authoritative documents were not consulted.”

The airline had failed to detect the slipping standards because it did not monitor its more senior mechanics. Nor did it help that its procedure for gathering feedback on the effectiveness of the maintenance system was not working properly: the AAIB estimated that the ratio of near misses to serious accidents might be as high as 600 to one, so successful detection of system failures depends on reporting a substantial proportion of near misses.

What aviators learnt

The success of commercial aviation in flight safety is built on two pillars:

  • Analysis of accidents and near misses down to their root causes, including system failures and the effects of human psychology and behaviour at all levels;
  • Remedying the systemic weaknesses, and managing the behavioural and psychological issues, that this analysis uncovers.

These systemic issues include weaknesses caused by human behaviour. Aviators have overcome the idea, common elsewhere, that ‘systems’ mean no more than processes. Systems do include processes, but aviators recognise that humans are an integral part of their systems, so they treat normal, predictable human behaviour as part of the flight safety problem and build the lessons about it into flight safety practice.

Practical lessons for all

Thus even the most experienced pilots are taught to listen to subordinates and to welcome challenge. Everyone is trained to challenge whenever necessary and to ensure they are heard. All are trained to listen to each other and to cooperate, especially under stress. And through what is known as a “just culture”, the whole commercial aviation system encourages the reporting, even self-reporting, of near misses and errors as well as accidents, so that they can be analysed to their root causes and the lessons fed back to all. The deal is spelt out on the CAA website:

“Just culture is a culture that is fair and encourages open reporting of accidents and incidents. However, deliberate harm and wilful damaging behaviour is not tolerated. Everyone is supported in the reporting of accidents and incidents.”

This is not whistleblowing to bypass belligerent bosses: it is a routine system that applies to everyone, every day, at whatever level. It applies to all directly involved in flight operations, including leaders on aircraft and those who lead the manufacture, maintenance and support of aircraft and the systems that keep them flying. No-one in the system is above it; and the CAA’s statement of the just culture is an endorsement of the flight safety culture from aviation’s highest level: its regulator.

Everyone in the system now accepts it, though it was initially resisted, just as Professor Atul Gawande’s surgical checklists were initially resisted by some surgeons. It was no surprise to psychologists that most of the minority who resisted Gawande’s checklists thought that, although they did not need to use checklists themselves, any surgeon operating on them should use one.

The story of flight safety illustrates how carefully thought-through culture change has brought about a system so safe that few even think about flight safety. Aviation has achieved this despite the system’s complexity, which includes legions of organisations, large and small, worldwide.

Applying the lessons beyond aviation

Can it be replicated elsewhere? The fact that airlines can have serious failures beyond flight safety – British Airways’ recent IT failure, for example – confirms that the cultural transition from flight safety to the rest of the business is not automatic, even where the group chief executive was once a pilot.

There can be no doubt that senior UK financial regulators understand that cultural, management and leadership failures in and around finance were among the root causes of the 2007/8 financial crisis. Some of these roots – such as the accumulation and promotion of undesirable character traits among staff hired primarily for greed and aggression – go deep. But many, even if not transient, are less deep-rooted.

A better culture, and the incentives and other drivers to support it, can be designed and launched surprisingly fast, though embedding it will take longer. Incorporating a culture of learning from errors and near misses, as well as from serious failings in conduct, will help identify systemic weak spots so that they can be remedied.

But just as it was crucial that even the most senior pilots learned to welcome analysis and challenge of their actions, so too must business leaders. Their perceived character, culture, incentives and behaviour are crucial models for their subordinates. And just as the CAA overtly underpins aviation’s culture of learning from error, so regulators, and their political masters, must embrace the importance of an open, analytical – and forgiving – attitude to honest mistakes.

Anthony Fitzsimmons
Reputability LLP
London

Anthony Fitzsimmons is Chairman of Reputability LLP and, with the late Derek Atkins, author of “Rethinking Reputational Risk: How to Manage the Risks that can Ruin Your Business, Your Reputation and You”.

This article was first published in the August/September 2017 edition of Financial World.