Pa Patient Saf Advis 2018 Oct 31;15(Suppl 1):68-9.
Acquiring Diagnostic Skill: Understanding the Decision Making Processes Used by Experts
Anesthesiology, Cardiology, Critical Care, Emergency Medicine, Nursing, Oncology, Pathology
Authors

Ellen S. Deutsch
Editor, Pennsylvania Patient Safety Advisory
Medical Director, Pennsylvania Patient Safety Authority

Laura G. Militello
CEO, Applied Decision Science, LLC
VP of Research, Unveil, LLC

Disclosure: Ms. Militello declares that she has no relevant or material financial interests to disclose.

Corresponding Author
Ellen S. Deutsch

Introduction

How is an art expert certain that a famous painting is not an elaborate forgery?

How can an antiquities expert determine that an artifact is more contemporary than claimed?

Experts use a combination of rapid recognition and deliberate analysis, described as System 1 and System 2 thinking.1 System 1 thinking, also called "fast" thinking or intuition,2 involves pattern recognition: skilled practitioners quickly recognize typical situations. An art expert can rapidly identify common forgeries and copies that appear genuine to the untrained eye.3 Even more impressive, an art expert may detect a subtle anomaly that many others miss, noting that something just doesn't seem right in a high-quality copy.

This ability to recognize the typical and detect anomalies is key to diagnostic skill in the clinical domain as well as in the art domain. It is the "gut feeling"2 or "Spidey sense" that clinicians describe when they walk into a room and "just know" that a patient is sick.

In some situations, however, System 1 thinking is not enough. Kahneman describes the more deliberate analytic process that clinicians use to diagnose a patient's condition as System 2 or "slow" thinking.1 Clinicians often generate hypotheses and seek information to confirm or rule out each one, particularly when faced with a novel presentation, uncertainty, or complexity.

Determining the correct diagnosis for a patient has always been essential in healthcare, but how the diagnostic process actually occurs has recently received increased attention.4,5 Under what circumstances is immediate recognition of a pattern, an intuitive process,6 a good thing? Does it reflect experience and knowledge? Or is it error-prone, subject to bias and presumptions?7

Conversely, when is a methodical approach more appropriate? Should the clinician work through an organized algorithm to ensure a thorough review of all potential diagnoses? Or is this process unnecessarily laborious when the correct diagnosis might be obvious?

Novice and Expert Decision-Making in Diagnosis

Both System 1 and System 2 thinking can be appropriate, depending on the experience level of the clinician, the complexity or rarity of the medical condition, and the time and resources available. However, descriptive studies suggest that novice and expert clinicians use different processes to develop diagnoses.

Novice clinicians tend to depend more on quantifiable, verifiable objective data, such as vital signs and laboratory test results. They seek affirmation, explicitly or implicitly, from other members of the healthcare team to verify their assessments.8

Experts generally have more confidence in their assessments. They develop hypotheses based on both objective data and less quantifiable information, such as changes in mental status, abnormal skin temperature or color, or an "ill appearance."8 They respond to their own gut feelings. In an ongoing process, experts sequentially test and revise their assessments and refine their diagnoses based on the patient's response to interventions. Experts are sensitive to circumstances in which test results or the patient's evolving medical condition violates their expectations.8

For experts, even System 2 thinking is driven by their own experiences, as they develop and test hypotheses based on recognition of typical and anomalous data. An expert's experience informs what he or she considers relevant and useful data to assess, track, and test to form hypotheses.9 Studies of expertise across a range of domains suggest that exposure to many cases or a "deep experience-base" is needed to develop these critical skills.10,11

Although experts appear to move between System 1 and System 2 thinking seamlessly, it is sometimes assumed that the methodical, systematic approach to diagnosis is inherently the better process. Indeed, that is how medical students and residents are taught. For novices, this may be necessary because they have not yet built up a repository of clinical experiences to support the variety of patterns needed for recognition. Training and testing for physicians often involve developing an extensive and thorough "differential diagnosis," with justification or rebuttal of a wide variety of possible diagnostic options (e.g., congenital, traumatic, infectious, neoplastic). This stepwise exercise may be valuable for a novice, but it can be frustrating for an expert faced with an immediately recognizable condition he or she has seen and treated many times before.

Helping Clinicians Improve Their Skills

If experiences can help novices develop expertise, can opportunities for experiences be optimized? Can educational experiences help novices deepen both System 1 and System 2 knowledge? Conversely, can experts develop techniques to avoid premature diagnostic closure?

Simulation, or training and practice in lifelike situations, can be a useful option.

Broadly, simulation provides directed practice in identifying and managing a variety of medical conditions, at the relative convenience of learners and faculty and without direct risk to patients.12-14 Simulation is effective enough to have gained recognition as "a central thread in the fabric of medical education."14

Simulation-based education ranges from practice of isolated skills to participation in complex immersive scenarios. Simulators (i.e., simulation devices), which are incorporated into simulations, may be high or low technology; physical, virtual, or human; or combinations of these. Current limitations in simulators' ability to demonstrate evolving changes in a patient's mental status, skin color, or lesions and rashes may be mitigated as simulation increasingly incorporates augmented reality.15

Simulation-based facilitated experiential learning, including debriefing, is typically implemented in safe and supportive environments that have been shown to help students learn.14 Simulations can therefore increase learners' exposure to recognizable and retrievable diagnostic patterns and provide opportunities for faculty to elicit and reinforce correct diagnostic assessments.

For example, trainees at the Children's Hospital of Philadelphia participate in simulations designed to help them distinguish different types of shock and cardiac arrhythmias. Trainees at the ORL Emergencies Boot Camp participate in simulations designed to help them recognize and manage "cannot intubate, cannot ventilate" patient care emergencies;16 one participant published a testimonial that the lessons he learned saved a patient's life.17 

Much contemporary literature focuses on strategies intended to prevent erroneous diagnoses, with the implication that the weaknesses of humans attempting to reach correct diagnoses can be overcome if sufficient constraints are imposed. But teaching learners about their limitations does not necessarily enhance their strengths. Investigators have demonstrated that the performance of learners who are debriefed about their successes as well as their failures improves significantly compared with the performance of learners who are debriefed only about their failures.18

Other research suggests that debriefing strategies that help novices see what experts noticed, and how experts made sense of the situation, support skill acquisition.19 Contemporary Safety-II principles, which seek to "enable things to go right more often,"11 align with the premise that recognizing and reinforcing successful identification of diagnostic patterns will improve the diagnostic process. Simulation-based training can promote effective use of both System 1 and System 2 thinking because it extends a clinician's experience base and encourages reflection, with timely feedback and exposure to expert perspectives.

Developing Expertise

Determining the correct diagnosis for a patient may be extremely easy or incredibly difficult. Simulation provides a mechanism that supports directed practice. Debriefing provides a mechanism to amend incorrect diagnoses, confirm correct diagnoses, and expose the learner to expert diagnostic processes. Reinforcing correct diagnoses and providing a view into expert thought processes are important components of helping novices develop expertise and helping experts fine-tune their skills.

Notes

  1. Kahneman D. Thinking, fast and slow. New York (NY): Farrar, Straus and Giroux; 2011. 533 p.
  2. Klein G. The power of intuition: how to use your gut feelings to make better decisions at work. New York (NY): Doubleday; 2003.
  3. Klein G, Hoffman RR. Seeing the invisible: the perceptual cognitive aspects of expertise. In: Rabinowitz M, editor(s). Cognitive science foundations of instruction. Hillsdale (NJ): Lawrence Erlbaum Associates; 1993. p. 203-26.
  4. Henriksen K, Brady J. The pursuit of better diagnostic performance: a human factors perspective. BMJ Qual Saf. 2013 Oct;22 Suppl 2:ii1-ii5. Also available: http://dx.doi.org/10.1136/bmjqs-2013-001827. PMID: 23704082
  5. Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, Tant E, Henriksen K, Labresh K, Singh H. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012 Jul;21(7):535-57. Also available: http://dx.doi.org/10.1136/bmjqs-2011-000149. PMID: 22543420
  6. Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol. 2009 Sep;64(6):515-26. Also available: http://psycnet.apa.org/doiLanding?doi=10.1037%2Fa0016755. PMID: 19739881
  7. Croskerry P. To err is human--and let's not forget it. CMAJ. 2010 Mar 23;182(5):524. Also available: http://dx.doi.org/10.1503/cmaj.100270. PMID: 20231338
  8. Patterson MD, Militello LG, Bunger A, Taylor RG, Wheeler DS, Klein G, Geis GL. Leveraging the critical decision method to develop simulation-based training for early recognition of sepsis. J Cogn Eng Decis Mak. 2016;10(1):36-56. Also available: http://dx.doi.org/10.1177/1555343416629520.
  9. Klein G, Moon B, Hoffman RR. Making sense of sensemaking 1: alternative perspectives. IEEE Intel Syst. 2006;21(4):70-3.
  10. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008 Nov;15(11):988-94. Also available: http://dx.doi.org/10.1111/j.1553-2712.2008.00227.x. PMID: 18778378
  11. Braithwaite J, Wears RL, Hollnagel E. Resilient health care: turning patient safety on its head. Int J Qual Health Care. 2015 Oct;27(5):418-20. Also available: http://dx.doi.org/10.1093/intqhc/mzv063. PMID: 26294709
  12. Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ. 2003 Mar;37(3):267-77. PMID: 12603766
  13. Deutsch ES. Simulation in otolaryngology: smart dummies and more. Otolaryngol Head Neck Surg. 2011 Dec;145(6):899-903. Also available: http://dx.doi.org/10.1177/0194599811424862. PMID: 21965444
  14. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Med Educ. 2010 Jan;44(1):50-63. Also available: https://doi.org/10.1111/j.1365-2923.2009.03547.x. PMID: 20078756
  15. Militello LG, Sushereba CE. Augmented reality: the future of medic training. Proc Int Symp Human Factors Ergon Health Care. 2018.
  16. Deutsch ES, Malloy KM, Malekzadeh S. Simulation-based otorhinolaryngology emergencies boot camp: Part 3: Complex teamwork scenarios and conclusions. Laryngoscope. 2014 Jul;124(7):1570-2. Also available: http://dx.doi.org/10.1002/lary.24570. PMID: 24375442
  17. Tompkins JJ. Use of simulation boot camps to train junior otolaryngology residents: a resident's testimonial. JAMA Otolaryngol Head Neck Surg. 2014 May;140(5):395-6. Also available: http://dx.doi.org/10.1001/jamaoto.2014.202. PMID: 24676714
  18. Ellis S, Davidi I. After-event reviews: drawing lessons from successful and failed experience. J Appl Psychol. 2005 Sep;90(5):857-71. Also available: http://dx.doi.org/10.1037/0021-9010.90.5.857. PMID: 16162059
  19. Klein G, Borders J. The Shadowbox approach to cognitive skills training: an empirical evaluation. J Cogn Eng Decis Mak. 2016;10(3):268-80.

The Pennsylvania Patient Safety Advisory may be reprinted and distributed without restriction, provided it is printed or distributed in its entirety and without alteration. Individual articles may be reprinted in their entirety and without alteration, provided the source is clearly attributed.

Current and previous issues are available online at http://patientsafety.pa.gov.

©2018 Pennsylvania Patient Safety Authority