
While it’s important not to think immediately of zebras when hearing hoofbeats, it’s just as important not to assume it’s always a horse. The delicate balance between not jumping to the seemingly obvious diagnosis and not overanalyzing and overtesting is familiar to all physicians, and it’s far easier to avoid diagnostic mistakes when you understand the cognitive biases that can lead doctors astray.


“When these errors are made, it’s not because physicians lack knowledge, but they go down a wrong path in their thinking process,” Richard Scarfone, MD, a pediatric emergency medicine physician at the Children’s Hospital of Philadelphia, told attendees at the annual meeting of the American Academy of Pediatrics, held virtually this year. “An important point to be made here is that how physicians think seems to be much more important than what physicians know.”

Dr. Scarfone and Joshua Nagler, MD, MHPEd, director of the pediatric emergency medicine fellowship program at Children’s Hospital Boston, presented a session on the cognitive biases that can trip up clinicians when making diagnoses and how to avoid them. Research shows that the rate of diagnostic error is approximately 15%. Although those findings come from studies in adults, the rates are likely similar in pediatrics, Dr. Scarfone said.

A wide range of clinical factors contribute to diagnostic errors: limited information, vague or undifferentiated symptoms, incomplete history, multiple transitions of care, diagnostic uncertainty, daily decision density, and reliance on pattern recognition, among others. Personal factors can play a role as well, such as atypical work hours, fatigue, one’s emotional or affective state, and a high cognitive load. On top of all that, medical decision-making can be complex in its own right, Dr. Scarfone said. He compared the differential diagnosis to a tree in which a single leaf is the correct diagnosis.
 

System 1 thinking: Pros and cons

Dr. Scarfone and Dr. Nagler explained system 1 and system 2 thinking, the two modes of decision-making that Daniel Kahneman described in his book “Thinking, Fast and Slow.” System 1 refers to snap judgments that rely on heuristics, while system 2 refers to a slower, more analytic process.


Neither system 1 nor 2 is inherently “right or wrong,” Dr. Scarfone said. “The diagnostic sweet spot is to try to apply the correct system to the correct patient.”

Heuristics are the mental shortcuts people use to make decisions based on past experience. They exist because they’re useful, enabling people to focus only on what they need to accomplish everyday tasks, such as driving or brushing teeth. But heuristics can also lead to predictable cognitive errors.

“The good news about heuristics and system 1 thinking is that it’s efficient and simple, and we desire that in a busy practice or ED setting, but we should recognize that the trade-off is that it may be at the expense of accuracy,” Dr. Scarfone said.

The advantage of system 1 thinking is easy, simple, rapid, and efficient decision-making that rejects ambiguity. It’s also usually accurate, which rewards the approach, and its accuracy increases over time with memory, experience, and pattern recognition. Doctors develop “illness scripts” that help them identify diagnoses.

“Illness scripts are common patterns of clinical presentations that usually lead us to a diagnostic possibility,” Dr. Scarfone said. “A classic illness script might be a 4-week-old firstborn male with forceful vomiting, and immediately your mind may go to pyloric stenosis as a likely diagnosis.” But the patient may have a different diagnosis than the initial impression your system 1 thinking leads you to believe.

“Generally, the more experience a clinician has, the more accurate they’ll be in using system 1,” he said. “Seasoned physicians are much more likely to employ system 1 than a newer physician or trainee,” which is why heuristics shouldn’t be thought of as hindrances. Dr. Scarfone quoted Kevin Eva in a 2005 review on clinical reasoning: “Successful heuristics should be embraced rather than overcome.”

A drawback of system 1 thinking, however, is the tendency to assume that “what you see is all there is,” which can lead to cognitive errors. Feeling wrong feels the same as feeling right, so you may not realize when you’re off target and may therefore neglect to consider alternatives.

“When we learn a little about our patient’s complaint, it’s easier to fit everything into a coherent explanation,” Dr. Scarfone said, but “don’t ask, don’t tell doesn’t work in medicine.”

Another challenge with system 1 thinking is that pattern recognition can be unreliable because it’s dependent on context. For example, consider the difference in assessing a patient’s sore throat in a primary care office versus a resuscitation bay. “Clearly our consideration of what may be going on with the patient and what the diagnosis may be is likely to vary in those two settings,” he said.
 

 

 

System 2 thinking: Of zebras and horses

System 2 is the analytic thinking that involves pondering and seeking out the optimal answer rather than the “good-enough” answer.

“The good news about system 2 is that it really can monitor system 1,” said Dr. Nagler, who has a master’s degree in health professions education. “If you spend the time to do analytic reasoning, you can actually mitigate some of those errors that may occur from intuitive judgments from system 1 thinking. System 2 spends the time to say ‘let’s make sure we’re doing this right.’ ” On multiple-choice tests, for example, people are twice as likely to change a wrong answer to a right one as they are to change a right one to a wrong one.

System 2 thinking allows for reasoned assessment of questions in the gray zone. It’s vigilant, reliable, and effective; it acknowledges uncertainty and doubt; it can be safer in terms of the care provided; and it has high scientific rigor. But it also has disadvantages, starting with the fact that it’s slower and more time-consuming. System 2 thinking is resource intensive, requiring greater cognitive effort.

“Sometimes the quick judgment is the best judgment,” Dr. Nagler said. System 2 thinking also is sometimes unnecessary and counter to value-based care. “If you start to think about all the possibilities of what a presentation may be, all of a sudden you might find yourself wanting to do all kinds of tests and all kinds of referrals and other things, which is not necessarily value-based care.” When system 2 thinking goes astray, it makes us think everything we see is a zebra rather than a horse.

Sonia Khan, MD, a pediatrician in Fremont, Calif., found this session particularly worthwhile.

“It really tries to explain the difference between leaping to conclusions and learning how to hold your horses and do a bit more, to double check that you’re not locking everything into a horse stall and missing a zebra, and avoiding going too far with system 2 and thinking that everything’s a zebra,” Dr. Khan said. “It’s a difficult talk to have because you’re asking pediatricians to look in the mirror and own up, to learn to step back and reconsider the picture, and consider the biases that may come into your decision-making; then learn to extrude them, and rethink the case to be sure your knee-jerk diagnostic response is correct.”
 

Types of cognitive errors

The presenters listed some of the most common cognitive errors, although their list is far from exhaustive.

  • Affective error. Avoiding unpleasant but necessary tests or examinations because of sympathy for the patient, such as avoiding blood work to spare a needle stick in a cancer patient with abdominal pain because the mother is convinced it’s constipation from opioids. This is similar to omission bias, which places excessive concern on avoiding a therapy’s adverse effects when the therapy could be highly effective.
  • Anchoring. Clinging to an initial impression or the salient features of the initial presentation, even as conflicting and contradictory data accumulate, such as diagnosing a patient with fever and vomiting with gastroenteritis even when the patient has an oxygen saturation of 94% and tachypnea.
  • Attribution errors. Negative stereotypes lead clinicians to ignore or minimize the possibility of serious disease, such as evaluating a confused teen covered in piercings and tattoos for drug ingestion when the actual diagnosis is new-onset diabetic ketoacidosis.
  • Availability bias. Overestimating or underestimating the probability of disease because of recent experience, what was most recently “available” to your brain cognitively, such as getting head imaging on several vomiting patients in a row because you recently had one with a new brain tumor diagnosis.
  • Bandwagon effect. Accepting the group’s opinion without assessing a clinical situation yourself, such as sending home a crying, vomiting infant with a presumed viral infection only to see the infant return later with intussusception.
  • Base rate neglect. Ignoring the true prevalence of disease by either inflating it or reducing it, such as searching for cardiac disease in all pediatric patients with chest pain.
  • Commission. A tendency toward action with the belief that harm may only be prevented by action, such as ordering every possible test for a patient with fever to “rule everything out.”
  • Confirmation bias. Subconscious cherry-picking: A tendency to look for, notice, and remember information that fits with preexisting expectations while disregarding information that contradicts those expectations.
  • Diagnostic momentum. Clinging to that initial diagnostic impression that may have been generated by others, which is particularly common during transitions of care.
  • Premature closure. Narrowing down to a diagnosis without thinking about other diagnoses or asking enough questions about other symptoms that may have opened up other diagnostic possibilities.
  • Representation bias. Making a decision in the absence of appropriate context by incorrectly comparing two situations because of a perceived similarity between them, or on the flip side, evaluating a situation without comparing it with other situations.
  • Overconfidence. Making a decision without enough supportive evidence yet feeling confident about the diagnosis.
  • Search satisfying. Stopping the search for additional diagnoses after the anticipated diagnosis has been made.
 

 

Cognitive pills for cognitive ills

Being aware of the pitfalls of cognitive errors is the first step to avoiding and mitigating them. “It really does start with preparation and awareness,” Dr. Scarfone said before presenting strategies to build a cognitive “firewall” that can help physicians practice reflectively instead of reflexively.

First, be aware of your cognitive style. People usually have the same thinking pattern in everyday life as in the clinical setting, so determine whether you’re more of a system 1 or system 2 thinker. System 1 thinkers need to watch out for framing (relying too heavily on context), premature closure, diagnostic momentum, anchoring, and confirmation bias. System 2 thinkers need to watch out for commission, availability bias, and base rate neglect.

“Neither system is inherently right or wrong,” Dr. Scarfone reiterated. “In the perfect world, you may use system 1 to form an initial impression, but then system 2 should really act as a check and balance system to cause you to reflect on your initial diagnostic impressions.”

Additional strategies include taking a thorough history and performing a meticulous physical exam: listen well, clarify unclear aspects of the history, and identify and address the main concern.

“Remember children and families have a story to tell, and if we listen carefully enough, the diagnostic clues are there,” Dr. Scarfone said. “Sometimes they may be quite subtle.” He recommended doctors perform each part of the physical exam as if expecting an abnormality.

Another strategy is using meta-cognition, a forced analysis of the thinking that led to a diagnosis. It involves asking: “If I had to explain my medical decision-making to others, would this make inherent sense?” Dr. Scarfone said. “If you’re testing, try to avoid anchoring and confirmation biases.”

Finally, take a diagnostic time-out with a checklist that asks these questions:

  • Does my presumptive diagnosis make sense?
  • What evidence supports or refutes it?
  • Did I arrive at it via cognitive biases?
  • Are there other diagnostic possibilities that should be considered?

One way to do this is to create a table listing the complaint or finding, the diagnostic possibilities suggested by system 1 thinking, the possibilities suggested by system 2 thinking, and then the possibilities beyond system 2 – the potential zebras – to consider when even the system 2 diagnoses don’t account for what the patient is saying or what the exam shows.
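For readers who want to operationalize the time-out, here is a minimal sketch of such a table as a simple data structure, written in Python. It is purely illustrative and not part of the presentation; the class, field, and variable names are hypothetical, and the clinical entries are drawn only from the examples cited earlier in this article.

# Illustrative sketch only – not from the AAP session.
# Models the diagnostic time-out table described above: the complaint or finding,
# system 1 (snap-judgment) possibilities, system 2 (analytic) possibilities,
# and the potential "zebras" beyond system 2.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DiagnosticTimeOut:
    complaint: str                                     # presenting complaint or finding
    system1: List[str] = field(default_factory=list)   # snap-judgment possibilities
    system2: List[str] = field(default_factory=list)   # analytic possibilities
    zebras: List[str] = field(default_factory=list)    # possibilities beyond system 2

    def checklist(self) -> List[str]:
        """Return the four time-out questions listed above."""
        return [
            "Does my presumptive diagnosis make sense?",
            "What evidence supports or refutes it?",
            "Did I arrive at it via cognitive biases?",
            "Are there other diagnostic possibilities that should be considered?",
        ]


# Example row, using only cases mentioned earlier in this article.
vomiting_infant = DiagnosticTimeOut(
    complaint="crying, vomiting infant",
    system1=["viral gastroenteritis"],
    system2=["pyloric stenosis", "intussusception"],
    zebras=["brain tumor"],
)

for question in vomiting_infant.checklist():
    print(question)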

These cognitive biases overlap enough with the intrinsic biases related to individual characteristics that Dr. Khan appreciated the talk on another level as well.

“For me, as a brown Muslim immigrant woman of color, I can sometimes see cognitive biases in action with my colleagues and realize that they are oblivious to it,” Dr. Khan said. “It’s really refreshing to see this issue come up and being discussed at the [AAP] National Conference and Exhibition.”

Dr. Scarfone, Dr. Nagler, and Dr. Khan reported no relevant financial disclosures.

This article was updated 12/8/2020.
 
