Volkswagen, icon of German industry, has fallen off its pedestal after revelations that up to 11 million of its diesel cars were fitted with software designed to deceive regulators about levels of toxic NOx emissions.
It all began to unravel in May 2014, when West Virginia University's Center for Alternative Fuels, Engines & Emissions (CAFEE) found that two of the three diesel-powered cars it was testing emitted between 5 and 35 times the official NOx standard on the road, though in laboratory conditions they complied. This was the first whiff of a rat.
According to a California Air Resources Board (CARB) letter of 18 September, CARB took up the case and pursued its own investigations. Discussions ensued with VW, which solemnly carried out tests and provided fixes that did not work when CARB tested them.
As CARB relates in its letter, on 3 September, after a year's prevarication, VW confessed. As the world now knows, large numbers of VW's diesel-powered motor cars were "designed and manufactured with a defeat device to bypass, defeat or render inoperative elements of the vehicle's emission control system" whilst meeting official testing procedures.
When the news became public, on 18 September, it was met with incredulity and a drop in the VW share price of about 25%. After defending an all but indefensible position for almost five days, VW's Chief Executive Martin Winterkorn resigned. Some began asking whether VW would survive.
Beyond VW there was collateral damage. The share prices of other car makers dropped, though less than VW's. A discussion began as to whether diesel technology, held responsible for large-scale health problems, was a wrong turning that should be abandoned. And questions were asked as to whether, after a long string of governance failures in Germany, German industry was really the paragon its good reputation suggested.
But the intriguing question is: why did this happen?
We do not yet know what happened, but it is not unknown for major corporate scandals to have their genesis in the boardroom. Think of the Olympus and Toshiba scandals. That said, there is no evidence of active boardroom involvement, and Mr Winterkorn's protestations of surprise imply that he first learned of the problem very late in the day. But the composition of the supervisory board, which does not seem to have been chosen with skill sets as its primary concern, has echoes of the boards at Airbus at the time of the A380 crisis and at the UK's Co-operative Group when it almost collapsed.
If that is so, he must have been in the dark: a piece of deliberate skulduggery was going on under his nose, but without his knowledge.
This, sadly, is a very common state of affairs. We call it the Unknown Knowns problem. There are things that leaders would dearly love to know - but they cannot find out until it is too late. In our research, 85% of leaders were taken by surprise when a serious crisis engulfed their company. Yet most of these crises were caused by systemic failures that had lain unrecognised for years, sometimes decades.
The most telling example is that there have been at least 14 rogue traders, averaging one every eighteen months, since Nick Leeson broke Barings in 1995: most recently the London Whale, who breached in 2012, when JP Morgan sustained losses of $6 billion on positions said to amount to about $160 billion. All the rogue traders operated in an environment where risk teams were huge: JP Morgan's risk team ran to thousands, but it didn't spot the Whale; nor did the astute Jamie Dimon - until an even more astute hedge fund spotted his problem from the outside and began to trade on his misfortunes.
What seems to happen is that some combination of character, culture, leadership, targets, incentives, corner-cutting, complexity, groupthink - and the slippery slope from gently bending rules to breaking them - leads an individual or team to start doing something that, as Warren Buffett put it, you "wouldn't be happy to have written about on the front page of a national newspaper in an article written by an unfriendly but intelligent reporter." It doesn't help if regulators are not robustly independent.
Once the wrongdoing has begun, it is very hard for participants to confess - doing so will probably lead to unpleasant sanctions - so it continues. The hole gets deeper.
The wrongdoing is rarely known just to the participants. Others usually know, but are unwilling to rock the boat. This may be because the wrongdoer has higher status; or it may be because they are in the same 'tribe', but as time passes unwillingness becomes tacit complicity. Many may know things are wrong but they won't tell anyone above them. Often the root causes are visible to the thoughtful, perceptive outside observer, such as a hedge fund or professional investor. We call these companies 'predictably vulnerable'.
There may be a potential whistle blower; but anyone who researches whistle blowing as an activity will discover that it is commonly career-ending, if not merely frustrating. It takes courage and determination to blow the whistle; and it takes an exceptional leader to listen with an open mind and understand what a whistle blower is alleging.
This is one of the ways in which leaders find themselves in the dark. Breaking this silence is difficult. It takes an insider-outsider, as anthropologists term it, armed with trustworthiness, skill and an understanding of human behaviour, to learn what insiders think and know but won't tell. A sensitive investigation should uncover the root-cause behavioural and organisational risks that lead to Unknown Knowns, so that leaders can fix at least the root causes before they do more harm. And as our research also shows, they usually do have some time.
An investigation may uncover things you wouldn't be happy to have written about on the front page of a national newspaper. If so, you should listen and learn; and be grateful for the opportunity to deal with them before they blow up and destroy your personal reputation as well as that of your organisation.
Anthony Fitzsimmons
Reputability LLP
London
www.reputability.co.uk
@Reputability
Monday, 21 September 2015
The Silo Effect
Like our primate forebears, we humans have long organised ourselves into social groups and it is only natural that we form teams at work too. Trust, a common purpose, shared culture and social norms are likely to develop within the family, tribe, group or team, along with a sense of identity that defines who is an insider and who is not.
As organisations grow, so do teams. The work of Robin Dunbar, an evolutionary psychologist and anthropologist, suggests trouble starts as group size extends beyond about 150. As teams grow and multiply, so do team identities and purposes. Those in one team can easily come to see those in another as outsiders and rivals. Cooperation becomes more difficult as their interests increasingly conflict.
When we examine the entrails of crises, persistently asking the question “why?” we often find that the root causes were well known at mid-levels of the company. Sometimes the actual crisis was predictable, even predicted, from what one individual knew – but for a variety of reasons no message arrived in the consciousness of someone sufficiently senior to take action. Frequently, key information known at mid-levels was spread among individuals who did not share it - so the information was never joined up. We have dubbed this the 'Unknown Knowns' problem.
We analyse causes of failure such as these by reference to factors such as culture, incentives, structural silos and the resulting non-communication of information. Risk, psychology and sociology inform the analysis, but we have long suspected that anthropologists could enrich the analytical framework – if only they were interested in the business world.
Gillian Tett is a postgraduate-level anthropologist turned FT journalist, and some of her most perceptive writing has taken an anthropological look at business life. Her latest book, ‘The Silo Effect’, explicitly brings her anthropological training to bear on it. As you would expect of an experienced journalist, it is engagingly written.
Tett begins by introducing anthropology and summarising a few core anthropological insights. Three are crucial:
- Human groups develop ways of classifying and expressing thoughts, and these become embedded in their ways of thinking;
- These patterns help to entrench patterns of behaviour, often in a way that reinforces the status quo;
- These mental maps are partly recognised by group members but some parts are subliminal whilst others are ignored because they are thought “dull, taboo, obvious or impolite”, leaving some subjects beyond discussion.
An outsider, not sharing the group’s mental maps, may see what insiders cannot; but the outsider typically lacks crucial information that is available to an insider. Tett describes how anthropologists attempt to become ‘insider-outsiders’ with access to inside information whilst retaining the relative objectivity of the outsider. It is no accident that our methodology has much in common with what she describes. We face the same challenges, except that we also aim to help insiders to understand what outsiders can see when given access to insiders’ knowledge.
The balance of the book consists of case studies, written in Tett’s usual lucid style. Two of her studies of failure are built on her extensive knowledge of the financial crash of the Noughties. She dissects how UBS, the Bank of England and the host of financial market regulators, experts and economists managed not to see the crash coming. A third tells how Sony, then a world leader, reorganised itself into a series of separate business units, each with its own objectives. By creating what became silos, Sony lost internal cooperation and its way.
Tett tentatively develops her theme to suggest an anthropological approach to mastering silos, from the outside as well as from within. She begins by describing how, with advice from Robin Dunbar, Facebook has set out to build structural bridges of friendship and trust between what might otherwise become silos, and to foster a culture that encourages experiment and cooperation across potential frontiers and treats mistakes as opportunities to learn.
Tett’s second, contrasting tale tackles breaking down long-established silos. Her story concerns one of the most tribally structured professions: medicine. Because the profession is structured around disciplines, there is a wasteful temptation for every doctor to apply their particular skill to your symptoms rather than to begin with an objective diagnosis and only then prescribe treatment, perhaps by another doctor. Tett tells how a perceptive question led Toby Cosgrove, CEO of Ohio’s Cleveland Clinic, to question and dismantle the Clinic's disciplinary silos and deliver care centred on the patient’s need for a dispassionate diagnosis before the most appropriate treatment is prescribed.
But for me, Tett’s third tale was the most telling. She relates how a detached but interested outsider, a hedge fund, was able to deduce that JP Morgan’s Chief Investment Office was placing huge bets on credit derivatives – at a time when JP Morgan’s leaders and risk team were completely ignorant of what was going on under their noses, let alone the scale. The hedge fund, BlueMountain, profited from the insight when the London Whale breached. The episode cost JP Morgan more than $6 billion in losses on a series of holdings with a value Tett estimates at approaching $160 billion.
This story resonates with our experience. We regularly find that external analysis can identify organisations that seem blithely to be living on the edge of a cliff. As with real cliffs, it is rarely possible to predict when they will fail. But it is possible to predict why and with what consequences they will fail.
Leaders who seek such foresight, uncomfortable though it may be, will usually have time to deal with the issues and avoid disgrace, since consequences usually take time to emerge.
Astute long term investors can use such insights to avoid or improve vulnerable investments. And in those rare cases where the timing seems imminent, there may be opportunities to profit from another’s risk blindness.
Anthony Fitzsimmons
Reputability LLP
London
www.reputability.co.uk
Anthony Fitzsimmons is Chairman of Reputability LLP and, with the late Derek Atkins, author of “Rethinking Reputational Risk: How to Manage the Risks that can Ruin Your Business, Your Reputation and You”
Sunday, 13 September 2015
Admitting Mistakes Shows Intelligence
We are delighted that Professor John Kay has allowed us to reprint this column, on how admitting to doubts and mistakes shows intelligence and good leadership, not stupidity.
When I was much younger and editing an economics journal, I published an article by a distinguished professor — more distinguished, perhaps, for his policy pronouncements than his scholarship. At a late stage, I grew suspicious of some of the numbers in one of his tables and, on making my own calculations, found they were wrong. I rang him. Without apology, he suggested I insert the correct data. Did he, I tentatively enquired, wish to review the text and its conclusions in light of these corrections, or at least to see the amended table? No, he responded briskly.
The incident shocked me then: but I am wiser now. I have read some of the literature on confirmation bias: the tendency we all have to interpret evidence, whatever its nature, as demonstrating the validity of the views we already hold. And I have learnt that such bias is almost as common in academia as among the viewers of Fox News: the work of John Ioannidis has shown how few scientific studies can be replicated successfully. In my inexperience, I had foolishly attempted such replication before the article was published.
It is generally possible to predict what people will think about abortion from what they think about climate change, and vice versa; and those who are concerned about wealth inequality tend to favour gun control, while those who are not, do not. Why, since these seem wholly unrelated issues, should this be so? Opinions seem to be based more and more on what team you belong to and less and less on your assessment of facts.
But there are still some who valiantly struggle to form their own opinions on the basis of evidence. John Maynard Keynes is often quoted as saying: “When the facts change, I change my mind. What do you do, sir?” This seems a rather minimal standard of intellectual honesty, even if it is one no longer widely aspired to. As with many remarks attributed to the British economist, however, it does not appear to be what he actually said: the original source is Paul Samuelson (an American Nobel laureate, who cannot himself have heard it) and the reported remark is: “When my information changes, I alter my conclusions.”
There is a subtle, but important, difference between “the facts” and “my information”. The former refers to some objective change that is, or should be, apparent to all: the latter to the speaker’s knowledge of relevant facts. It requires greater intellectual magnanimity to acknowledge that additional information might imply a different conclusion to the same problem, than it does to acknowledge that different problems have different solutions.
But Keynes might have done better to say: “Even when the facts don’t change, I (sometimes) change my mind.” The history of his evolving thought reveals that, with the self-confidence appropriate to his polymathic intellect, he evidently felt no shame in doing so. As he really did say (in his obituary of another great economist, Alfred Marshall, whom he suggests was reluctant to acknowledge error): “There is no harm in being sometimes wrong — especially if one is promptly found out.”
To admit doubt, to recognise that one may sometimes be wrong, is a mark not of stupidity but of intelligence. A higher form of intellectual achievement still is that described by F Scott Fitzgerald: “The test of a first-rate intelligence,” he wrote, “is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function.”
The capacity to act while recognising the limits of one’s knowledge is an essential, but rare, characteristic of the effective political or business leader. “Some people are more certain of everything than I am of anything,” wrote former US Treasury secretary (and Goldman Sachs and Citigroup executive) Robert Rubin. We can imagine which politicians he meant.
First published in the Financial Times.
© John Kay 2015 http://www.johnkay.com