Black Box Thinking


The Book of the Week is “Black Box Thinking: Why Most People Never Learn From Their Mistakes – But Some Do” by Matthew Syed, published in 2015. This volume attempted to answer the question: “How does failure-denial become so deeply entrenched in human minds and systems?”

The author described two ways of thinking:

1. Some people believe their abilities are fixed, so they won’t improve with practice. They fear failure, and they make excuses and/or blame others for their failures.

2. Other people believe they can get better with practice, and they are honest about admitting their errors and learning from them. For them, success is achieved only through trial and error, hard work, and persistence.

Number 1 above is also described in the following quote from Bertrand Russell: “There is something feeble and a little contemptible about a man who cannot face the perils of life without the help of comfortable myths. Almost inevitably some part of him is aware that they are myths and that he believes them only because they are comforting. But he dare not face this thought! Moreover, since he is aware, however dimly, that his opinions are not rational, he becomes furious when they are disputed.” Yet another way of putting it is “hubris syndrome.”

Two of America’s recent presidents – George W. Bush and Donald Trump – were this kind of thinker. According to the author’s thesis, they succeeded against the odds (if success is defined as getting elected president), considering that they were blind to their own character flaws.

BUT their common beginnings saw them through: both began with the special advantages of inherited money, mentors, lawyers, and valuable career and political contacts. They proceeded to fail upwards until they reached their peak “Peter principle” level, kind of like the joke: How do you make a small fortune in Israel? Answer: Come with a large one.

The author drew parallels between aviation and healthcare delivery. Both involve life-and-death scenarios when things go extremely wrong. However, that is where the similarities stop. The people who have shaped the evolution of aviation have built up a knowledge base that has produced lower and lower death tolls when catastrophes have occurred; by contrast, powerful, influential people working in healthcare have been stubbornly resistant to adopting measures that would drastically reduce unnecessary deaths.

The author cited real-life examples from Great Britain and the United States. But there are other major reasons why his comparison is mostly invalid. These involve lawsuits, unions, government regulations, the political climate at the time of the disasters, and the following:

Obviously, workers in aviation have more of an incentive to improve safety: in a plane crash, many more people might die all at once, compared to the one patient on an operating or examination table. Even if members of the flight crew survive a disaster, their careers are likely over. Even when doctors are at fault, they usually continue their careers.

The author discussed the pros and cons of just-culture versus blame-culture. He described the latter as follows: “It may be intellectually satisfying to have a culprit, someone to hang their disaster on. And it certainly makes life simple.”

The author recounted how a public-relations campaign can fool even intelligent people into believing that a particular method of crime prevention among young people works wonders. The only way to debunk such a myth is through numerous randomized controlled trials.

Read the book to learn about additional concepts surrounding psychological self-deceptions that humans employ in order to avoid admitting failures: cognitive dissonance, narrative fallacy, top-down versus bottom-up product development, various biases, and others.