To err is human, but…

In Being Wrong,1 her treatise on the psychology of human error, Kathryn Schulz quotes William James: “Our errors are surely not such awfully solemn things.”2 Being wrong, she argues, is part of the human genome. Despite aphorisms such as “we learn from our mistakes,” we are far from accepting of mistakes in medical practice. Perhaps naively, I do not believe that our need to understand how clinical errors occur and how to avoid them is based on the fear of legal repercussion. And of course we do not want to harm our patients. But our relationship with medical errors is far more complex than that. We really don’t want to be wrong.

Dr. Atul Gawande3 has promoted using checklists and a structured system to limit errors of misapplication of knowledge. Diagnostic and therapeutic algorithms, once the province of trauma surgeons, are increasingly becoming part of internal medicine.

When I was a house officer, we all had our “pocket brains” in our white coats—lists of disease complications, drug doses and interactions, causes of IgA deposition in the kidney, and treatment algorithms. But we believed (probably correctly) that our teachers expected us to commit all these facts to memory in our fleshy brains. The elitist and hubristic belief that this was uniformly possible has lingered in academic medicine, still permeating even the fabric of certification examinations. We learn that it is OK to be honest and say that we don’t know the answer, but we don’t like to have to say it. Physicians finish the academic game of Chutes and Ladders with a strong aversion to being wrong.

Younger doctors today seem more comfortable with not knowing so many facts and bits of medical trivia, since they can find answers instantly on their smartphones. But one challenge is knowing at a glance the context and veracity of the answers they find. And whether the knowledge comes from our anatomic, pocket, or cyber brain, the overarching challenge is to avoid Gawande’s error of misapplication.

In this issue of the Journal, Dr. Nikhil Mull and colleagues dissect a clinical case that did not proceed as expected. They discuss, in reference to the described patient, some of the published analyses of the clinical decision-making process, highlighting various ways that our reasoning can be led astray. Having just finished a stint on the inpatient consultation service, I wish I could have read the article a few weeks ago. A bit of reflection on how we reach decisions can be as powerful as knowing the source of the facts in our pocket brain.

Being wrong, as Schulz has written, is part of the human experience, but I don’t like it. Ways to limit the chances of its happening in the clinic are worth keeping on a personal checklist, or perhaps as an app on my smartphone.

References
  1. Schulz K. Being Wrong: Adventures in the Margin of Error. New York, NY: HarperCollins; 2010.
  2. James W. The will to believe. An address to the philosophical clubs of Yale and Brown Universities, 1896. http://educ.jmu.edu//~omearawm/ph101willtobelieve.html. Accessed October 12, 2015.
  3. Gawande A. The Checklist Manifesto: How to Get Things Right. New York, NY: Metropolitan Books; 2009.