Our brains are inherently lazy—they rely on automatic processes as much as possible.
And for good reason. Imagine if you had to stop and think every time you walked, rode a bicycle, drove your car, or logged onto your computer. Examples of situations where our brains are on "auto-pilot" are plentiful. We don't have to spend precious time thinking—we just react.
And this can actually be life-saving at times. If a car suddenly cuts in front of you on the freeway, you automatically hit the brakes and/or swerve to avoid it. Your body responds before your brain has time to think through the options. We have many learned responses that allow us to be more efficient and, often, more effective. In medicine, mental heuristics are useful in solving what can be challenging diagnostic dilemmas.
There is a dark side to these mental shortcuts, however. They can lead to severe, even catastrophic, decision errors. A recent literature review determined that between one-third and two-thirds of diagnostic inaccuracies were attributable to physicians' cognitive biases. In "The importance of cognitive errors in diagnosis and strategies to minimize them," published in Academic Medicine, Pat Croskerry notes that internal medicine, family practice, and emergency medicine are specialties especially prone to delayed, missed, or erroneous diagnostic decision-making. Overconfidence has been identified as a particularly common form of cognitive bias implicated in diagnostic errors. Physicians are not alone in their use of cognitive biases—everyone uses them, and they show up across professions and industries.
Why are cognitive errors so prevalent in people? The prefrontal cortex (PFC) of our brains allows us to engage in complex cognitive processes, decision-making, emotional regulation, and pro-social behaviors. This region of the brain is an energy hog: relative to its size, it consumes far more energy than other body organs. And once the PFC depletes its available energy (oxygen and glucose), it simply does not work well. So it engages in energy-saving strategies, the primary one being automaticity—the unconscious use of cognitive shortcuts. While this saves energy, it also produces numerous cognitive errors, referred to as "biases" and "heuristics."
The overconfidence bias
We will be exploring some of these errors in greater detail over the next several articles. Today, our focus is on the "overconfidence bias," which experts in the field of cognitive heuristics—such as Daniel Kahneman in "Thinking, Fast and Slow"—consider the most significant and most ubiquitous of all biases. This form of cognitive error is a tendency toward being too certain: the propensity to overestimate our ability when that estimate is not objectively reasonable. A common example: most people, when asked, estimate that their driving ability is significantly better than average—which, of course, cannot be true of most people. One frequent way overconfidence shows up in medicine relates to diagnostic accuracy. When physicians are certain about a diagnosis, they are less likely to order additional diagnostic studies, consider other forms of treatment, or ask more probing questions.
The overconfidence bias (like all cognitive errors) allows us to make decisions quickly and causes us to avoid critical thinking. Often overconfidence is the secondary error: another bias, such as the confirmation bias (the tendency to search selectively for evidence that confirms our beliefs), may be at the root, and overconfidence in one's knowledge then leads to the error in diagnosis. Because these types of errors occur automatically and unconsciously (most of the time), they present a significant challenge: if we don't know we are doing it, how can we fix it? The answer is to employ strategies that bring our attention to its presence.
"Debiasing" refers to strategies aimed at mitigating the impact of mental errors, and its use has gained some attention in the medical literature (e.g., Cassam). Know that whenever we are under stress, it takes a toll on the PFC, causing us to rely even more heavily on automatic processing, which of course leads to more cognitive errors. That is when it becomes particularly important to practice debiasing strategies.
Here are a few you can use to address overconfidence:
1. Recognize that medicine often lends itself to overconfidence—not knowing or being uncertain is not always valued, by the profession and/or by patients. However, acknowledging uncertainty, coupled with strategies to gain greater certainty (such as further diagnostic assessments, asking more questions, consulting with a colleague, and/or reviewing the literature), goes a long way toward increasing patients' confidence in and appreciation for their physician.
2. Recognize that overconfidence is a common cognitive error and that the more you are aware of its existence, the more you can do to mitigate its potentially detrimental effects.
3. Be aware that you don't know what you don't know. Practice humility and an openness to learning. On a regular basis, ask yourself probing questions. For example: What might I be missing? What else could this set of symptoms mean? What else could I ask the patient to gain more understanding?
The next article in this series will explore other cognitive errors and add to our list of “debiasing” strategies.
Catherine Hambley, PhD, is a strategic and innovative business partner working with organizations, teams, managers, and senior leaders to promote their effectiveness and success.