The Ego Effect: The Science Behind Why We Keep Making the Same Mistakes Over and Over Again

In 1954, Leon Festinger—one of the most influential psychologists of all time—came across a strange headline on the back page of his local newspaper, the Lake City Herald, which read: ‘Prophecy from Planet Clarion Call to City: Flee That Flood.’

The body of the story described a housewife named Marian Keech (real name: Dorothy Martin), who claimed that she had received a prophecy from “superior beings from a planet called Clarion,” who told her the world would be destroyed by a flood on 21 December 1954.

Up until that day, Keech had convinced some of her friends and a small band of followers to quit their jobs, give away their possessions, and await the spaceship that would pick them up in the back garden of her small house in Michigan at midnight and save them from the apocalypse.

Festinger wanted to dig deeper into the psychology of the cult, and glean some insight into a fascinating question: how would they react if the prophecy failed?

So, Festinger and three of his researchers joined the group and began to study the cult members.

By the morning of 21 December 1954, the cult members had congregated in Keech’s house, and removed all metal from their persons in order to safely board the incoming saucer—including zippers, metal clasps, buttons with metal backing, bobby pins, and belt buckles.

They were on a “twenty-four-hour alert,” eagerly anticipating the spaceship that would pick them up at any time.

By 5.30 pm, the cult members began to drum up explanations as to why the spaceship hadn’t arrived: “the saucers would indeed land when the time was ripe,” they said.

At 11.30 pm, Keech claimed to receive a psychic message that the flying saucer was on its way to pick up the chosen ones, so the group stood outside in the freezing cold and snow, shivering for hours.

At about 2 am the next day, the exhausted and nearly frozen group gave up waiting for the saucer and went to bed.

Shortly afterwards, the cult members—crushed and defeated—would disband, give up on their beliefs and move on with their lives.

Or would they?

Image of Dorothy Martin (aka “Marian Keech”), the cult leader in Leon Festinger’s cult group study (Image Credit: Charles E. Knoblock)

Plane Crashes, Medical Errors and Stupid Experts

“Learn from the mistakes of others. You can’t live long enough to make them all yourself.”

— Eleanor Roosevelt

In his classic book, When Prophecy Fails, Festinger describes the fascinating behavioral changes among the cult members after the prophecy of the apocalypse failed to materialize.

Festinger observed that prior to the failed prophecy, the group avoided the press at all costs, swore to secrecy, and took pride in not persuading people to join the cult.

Immediately after the failed prophecy, however, the group avidly sought out the press and publicity, shared their innermost secrets with the world, attempted to attract new members through persuasion, and began to make prediction after prediction in the hope that one would come true.

In essence, the group reinterpreted the conflicting evidence to reinforce their beliefs, and they became even more committed to the cult than ever before. As Keech put it, “it was this little group spreading light here that prevented the flood.”

At first glance, it’s easy to ridicule the beliefs and behaviors of the cult members as superstitious and stubborn. But in 1957, Festinger put together his findings in the groundbreaking book ‘A Theory of Cognitive Dissonance,’ which exposed our psychological tendency to reframe conflicting evidence in support of our deeply held beliefs, instead of changing those beliefs. 1

Rather than own up to our mistakes and learn from them, we tend to invent new explanations as to why the mistake occurred, or ignore the conflicting evidence altogether.

This behavioral tendency is especially prevalent in health care. For example, in her book After Harm, the health researcher Nancy Berlinger investigated how doctors typically reframe their mistakes and noted that, “observing more senior physicians, students learn that their mentors and supervisors believe in, practise and reward the concealment of errors.”

She also said, “They learn how to talk about unanticipated outcomes until a ‘mistake’ morphs into a ‘complication’. Above all, they learn not to tell the patient anything.”

This behavior is partly due to fear of litigation, but even more to the ego of clinicians: after spending the better part of a decade on a long, tedious and expensive medical education, they expect themselves to execute procedures flawlessly and avoid mistakes altogether.

When errors occur, the resulting cognitive dissonance causes most doctors to avoid reporting the mistake. For example, a European medical study found that only 32 percent of doctors reported their errors, even though 70 percent agreed that doing so was the right thing to do. 2

In essence, the culture within health care protects the ego of clinicians through the avoidance and concealment of failure, which breeds repetitive mistakes and needless deaths.

A study published in the Journal of Patient Safety estimated that more than 400,000 deaths each year in the United States are associated with preventable harm in hospitals. 3

In contrast, over the past century, the aviation industry has pushed past its ego, exposed its failures and learnt from its mistakes—saving millions of lives in the process.

When Captain Chesley Sullenberger safely—and miraculously—landed US Airways Flight 1549 on the Hudson River in 2009, he was praised as a hero and a skillful pilot. But Sullenberger said otherwise:

“Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died . . . We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.”

A further testament to the aviation industry’s commitment to learning from failure is its use of “black boxes.” 4

Each aircraft is equipped with two nearly indestructible “black boxes”: one records conversations in the cockpit, and the other records instructions sent to the on-board electronic systems.

When an accident occurs, independent investigators open the black box, analyse the data and uncover the root causes of the accident, to ensure that the same mistake doesn’t happen again.

Furthermore, the investigation reports are shared publicly, and airlines are held accountable for making the recommended changes.

The rewards of this progressive attitude towards failure are evident: in 1912, eight out of fourteen US Army pilots died in plane crashes, whereas in 2013 there were only 210 fatalities across 36.3 million commercial flights carrying three billion passengers. 5

This is a stark contrast to the repetitive mistakes that have caused millions of preventable deaths in health care.

Why has aviation so dramatically outperformed health care in error prevention, despite medicine’s centuries-long head start?

The answer to this puzzle can be explained by what we can call “The Ego Effect.”

In essence, The Ego Effect suggests that you’re prone to making the same mistakes over and over again when you protect your beliefs instead of learning from your mistakes and changing those beliefs in response to conflicting evidence.

This is why Keech and her cult members resolved to keep waiting for the flying saucers despite the failed prophecy. It’s why experts and pundits so often make the worst predictions, then cover them up with follow-up explanations for why they were right all along.

And it’s why smart people, like doctors and judges, make stupid decisions.

It’s Okay to Be Wrong

In today’s media-driven culture, failures and mistakes are not only exposed to the public eye—leading to criticism and reputational damage—but they’re also stigmatized within the workplace, which stifles smart decision-making and innovation.

As a result, we tend to fall prey to The Ego Effect and the cognitive biases that lead us to keep making the same mistakes over and over again.

Instead, we can take a page out of the aviation industry’s book: swallow our pride, admit our mistakes and put systems in place to learn from them.

Ultimately, the people and organizations that embrace failure and build a strong culture of learning from their mistakes will thrive in our increasingly complex and uncertain world.


FOOTNOTES

  1. Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.
  2. Vincent, J. L. (1998). ‘Information in the ICU: Are we being honest with patients? The results of a European questionnaire’. Intensive Care Medicine, 24(12): 1251–6.
  3. James, J. T. (2013). ‘A New, Evidence-Based Estimate of Patient Harms Associated with Hospital Care’. Journal of Patient Safety, 9. doi:10.1097/PTS.0b013e3182948a69.
  4. The term ‘black box’ is a misnomer: the two devices are officially known as the Cockpit Voice Recorder and the Flight Data Recorder, and both are in fact bright orange, not black.
  5. https://www.skybrary.aero/bookshelf/books/2710.pdf