https://immattersacp.org/archives/2008/01/groopman.htm

Mindful Medicine: Critical thinking leads to right diagnosis

Jerome Groopman, FACP, author of the bestselling “How Doctors Think,” and his wife, endocrinologist Pamela Hartzband, ACP Member, discuss the art of medical diagnosis and decision making through a series of case studies suggested by readers.


Modern clinical practice has successfully integrated new sciences over the past decades. Molecular biology, particularly DNA analysis, is now routine in characterizing the genesis of different diseases and, in some cases, in targeting specific treatments. Mechanical engineering, in the form of robotics, has greatly assisted surgeons and other interventionists in performing delicate procedures that demand accuracy beyond that of the human hand. High-performance computing has opened up the field of bioinformatics and improved image resolution in MRI and other scanning technologies. Yet modern medicine has largely neglected an important, emerging and highly relevant science: that of cognition.

Throughout our training in internal medicine, our specialty fellowships and our subsequent roles as medical educators, there was little focus on the thinking processes that can lead clinicians away from a correct diagnosis. Yet many medical errors are triggered by cognitive missteps. Although the figures are only estimates, several studies in the medical literature indicate that misdiagnosis occurs in some 15% to 20% of all cases, that about 80% of these misdiagnoses are attributed to cognitive errors, and that half of all misdiagnoses cause serious harm to the patient.

In this column, we will introduce the vocabulary of the emerging field of cognitive science as it applies to clinical medicine. We will outline the major types of thinking errors and then show how these concepts apply to cases of misdiagnosis and misguided care. It is our hope that incorporating this new knowledge about how our minds work will help physicians make more accurate diagnoses and offer the most effective treatments.

As our practices become more technology-based, there is a sense that advanced radiology techniques, computers and other technical aids could perhaps reduce the frequency of our thinking mistakes. Interestingly, one study showed that the advent of CT scans did not appreciably change the rate of misdiagnosis and that, in fact, the technology sometimes contributed to diagnostic error. While electronic medical records and other computer-based systems are certainly useful, studies to date have not shown a robust benefit with regard to reducing the rate of misdiagnosis.

We readily admit to being traditionalists; we believe that a modern-day clinician can benefit from the fruits of technology, but that this technology will not replace his or her knowledge, experience and critical thinking: in short, the resources of his or her mind. Importantly, one of those resources should be what is termed “meta-cognition,” the ability to think about one's own thinking, and an awareness of how information can be misinterpreted or misleading and of our susceptibility to biases and cognitive pitfalls.

For this first column, we offer a brief narrative of one of our own misdiagnoses to demonstrate some of the most common thinking traps and to introduce the vocabulary of cognitive science. We aim to make future columns more dynamic and interactive by inviting our readers to submit their own case histories and have us view these narratives through the lens of cognitive science. Readers should submit a case history of no more than 500 words to us via ACP Internist. Selected cases will form the basis of future columns, starting with the March issue of ACP Internist and continuing every other issue thereafter throughout 2008. Depending on the number of submissions, we may not be able to respond to every physician individually, but we hope that the cases we select will hold lessons for all readers.

Case study: A resident's mistake

A resident in medicine cared for an elderly woman who complained of discomfort under her sternum. A physical examination showed no abnormalities, and routine blood work, including a CBC, electrolytes, liver function tests, and BUN and creatinine, was normal. Her chest X-ray was read as unremarkable. The resident told the woman that her symptom was likely caused by acid reflux and prescribed antacids. The treatment afforded limited relief, and she continued to complain. When he saw her at a follow-up clinic visit, he found no change on physical examination and told her to continue the treatment. Over the course of several weeks, she continued to complain, and her voice started to sound like a nail scratching a chalkboard. The resident advised her to keep taking antacids and assured her she would get better.

Some weeks later, the resident was urgently paged to the ER. The woman had excruciating chest pain and was in shock. She proved to have a dissecting aortic aneurysm. The young physician's mentors told him that making the diagnosis could be difficult and that few patients her age survive the extensive surgery. Their words provided scant comfort.

In this case, the resident was one of us (Dr. Groopman). We have frequently discussed this case and others among our misdiagnoses, but it was only recently, when we began to delve into cognitive science, that we came to understand the underlying thinking errors that led to this kind of serious misdiagnosis.

Making a correct diagnosis involves arranging the information from patient symptoms, signs, and laboratory findings into a pattern, and superimposing this pattern onto a template of a typical case in the doctor's mind. Medical textbooks and evidence-based protocols provide clinicians with an important starting point for analyzing symptoms and determining the likely diagnosis in typical cases. But we don't always arrive at the correct diagnosis using these tools. Why not?

There are a number of factors that make pattern recognition difficult. There may be incomplete or misleading information. Cases are not always “typical.” Most importantly, how we select the clinical elements, weigh their importance and arrange them can result in several different patterns, all leading to very different diagnoses.

Amos Tversky, PhD, and Daniel Kahneman, PhD, experimental psychologists working at Hebrew University in Jerusalem some three decades ago, studied how thinking occurs under conditions of uncertainty and time pressure. Drs. Tversky and Kahneman, who pioneered the field of behavioral economics, demonstrated in their research on heuristics and biases that human errors in judgment can be categorized and predicted; they later developed “prospect theory” to describe decision making under risk. Although the cognitive pitfalls they identified were found largely by studying students under experimental conditions, these thinking traps provide a framework for understanding clinical mistakes in real-world practice. Three common decision-making heuristics (anchoring, availability and attribution) were identified by Drs. Tversky and Kahneman and by other psychologists who built upon their theories. These principles can be applied to the process of formulating a diagnosis:

  • Anchoring refers to the tendency to seize on the first symptom, physical finding or laboratory abnormality and “anchor” one's mind onto an answer quickly. Although such snap judgments may prove correct, they can also lead us astray. Vague discomfort under the sternum may often reflect acid reflux. Of course, it is not the only explanation of this symptom. In the case described, the resident's mind was anchored on this diagnosis and would not detach.
  • Availability refers to the tendency to assume that an easily remembered prior experience explains the new situation you are facing. As residents, we saw many patients with vague substernal complaints that reflected acid reflux. This diagnosis was familiar and “available” in the young physician's mind.
  • Attribution refers to the tendency to invoke stereotypes in our minds and “attribute” symptoms and findings to the stereotype, which often is negative. In this case, the young physician viewed the older woman as something of a complainer, even a hypochondriac, so that her persistent symptom was not taken very seriously.

In retrospect, it seems difficult to imagine that a resident physician could not detach from his first impression and explore other causes for the patient's problem more widely. Reading the narrative on paper, of course, is a world away from being a harried young doctor in training. By applying the vocabulary and principles of cognitive science, we can unravel the genesis of this and other types of thinking errors and discover why and how they occur. Further, understanding these errors helps us prevent them in the future.

There are many types of cognitive pitfalls. The three listed above are among the most frequent and, in real-world practice, often occur in concert. Although psychologists defined them individually under experimental conditions, when they occur together they reinforce one another. Future columns will refer back to these three cardinal errors and explore other thinking traps the clinician should be aware of and work to avoid.