Pa Patient Saf Advis 2018 Oct 31;15(Suppl 1):3-15.
Identifying and Learning from Events Involving Diagnostic Error: It’s a Process
Anesthesiology, Cardiology, Critical Care, Emergency Medicine, Nursing, Oncology, Pathology

Rebecca Jones, MBA, BSN, RN, CPHRM, CPPS
Director, Innovation & Strategic Partnerships
Pennsylvania Patient Safety Authority

Mary C. Magee, MSN, RN, CPHQ, CPPS
Sr. Patient Safety/Quality Analyst
Pennsylvania Patient Safety Authority

Corresponding Author
Rebecca Jones



Diagnosis involves a complex system with many team members and numerous interdependent steps, all of which can make it challenging to identify and learn from failures in the process. The Pennsylvania Patient Safety Authority sought to explore this by analyzing events involving patient harm. We queried the Pennsylvania Patient Safety Reporting System for Serious Events likely to involve diagnostic error or the diagnostic process reported during calendar year 2016. The query yielded 1,212 reports, from which we identified 138 diagnostic process failure events. We modified the diagnostic error evaluation and research (DEER) taxonomy and classified events according to process step and failure point. In the event reports, failure points in testing were involved most frequently (68.1%, n = 94 of 138) and the surgical/procedural care area predominated (21.0%, n = 29 of 138). Although the monitoring/follow-up process step accounted for just 13.0% of all events, it represented nearly half of those that resulted in death. Healthcare facilities can act now by using the modified DEER taxonomy to classify events from various sources, identify vulnerabilities in the diagnostic process, and prioritize areas of opportunity for learning and improvement.


According to the National Academy of Medicine (NAM), everyone is likely to experience at least one diagnostic error during his or her lifetime,1 and studies estimate that 12 million adults in the United States could be subject to diagnostic error each year.2,3 Diagnostic error has been identified as the leading cause of medical malpractice claims,4-6 with the majority of occurrences being classified as high severity and more than one-third resulting in death.6 Diagnostic errors are a major problem in both outpatient and inpatient care settings in the United States, but are more likely to result in death in the inpatient environment.4

Attention and action toward the problem of diagnostic error have been lacking, in part because errors and failures are difficult to measure.7-10 In recent years, multiple national organizations have acknowledged that improving diagnosis is a priority for patient safety and have started to take action.

In 2015, NAM released its report, Improving Diagnosis in Health Care, noting that quality and patient safety have neglected diagnostic error and diagnostic process failures because of the lack of effective measurement related to the diagnostic process and diagnostic outcomes.1 In its 2017 annual report, "Transition to the Quality Payment Program," the Centers for Medicare and Medicaid Services (CMS) recommended initial measure development in the area of diagnostic accuracy for several medical specialties.11 That same year, the National Quality Forum (NQF) convened a multistakeholder expert committee that released a measurement framework including a set of "prioritized measurement areas" to inform and guide future work to improve diagnostic quality and safety.12

Consensus is lacking, and challenges persist, in defining the terms associated with diagnostic error.7,8,10,13 The term diagnosis can refer to the process or to the outcome of the process,1,13 and researchers disagree about whether various types of occurrences should be labeled as diagnostic errors.10,13-15 Some people do not think a diagnostic error has occurred unless a clear error in assigning the correct diagnosis to the patient's condition was the proximate cause of harm or death, even when the occurrence arose from a problem during the diagnostic process.8,16 For example, a delayed diagnosis caused by a laboratory result that was not communicated to the ordering physician may be viewed by some as a diagnostic error and by others as a communication error.

The many paradigms of diagnosis—such as severity, complication, and recurrence15—depend on the expertise of, and interactions between, various members of the diagnostic team and the larger sociotechnical healthcare system.17

Our objectives for this study were to use a clear, structured approach to mitigate the challenges associated with measuring and defining diagnostic error and to classify and analyze events reported by Pennsylvania healthcare facilities to identify priority areas for learning and improvement.


Methods

Database Query

We queried the Pennsylvania Patient Safety Reporting System (PA-PSRS) database* for Serious Events likely to involve diagnostic error or the diagnostic process reported from January 1, 2016, through December 31, 2016. Key terms were used within pertinent event types and subtypes of the PA-PSRS taxonomy. Terms included grammatical and synonymic variations of the following words: accurate, appreciate, detect, deterioration, diagnosis, discover, discrepancy, failure, follow-up, identify, incidental, incorrect, interpretation, misread, notify, and retrospective.

This query yielded 1,212 PA-PSRS reports, which provided the basis for our manual analysis.
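As an illustration of the key-term screen described above, the sketch below shows how narrative text can be matched against grammatical and synonymic variants of the query terms. This is a hypothetical reconstruction, not the actual PA-PSRS query: the stem list is abbreviated and the prefix matching is approximate.

```python
import re

# Abbreviated stems for the key terms listed above; matching stems as
# prefixes catches grammatical variants (e.g., "diagnos" matches
# diagnosis, diagnosed, diagnostic). Illustrative only.
TERM_STEMS = [
    "accurat", "appreciat", "detect", "deteriorat", "diagnos",
    "discover", "discrepan", "fail", "follow-up", "identif",
    "incidental", "incorrect", "interpret", "misread", "notif",
    "retrospectiv",
]

PATTERN = re.compile("|".join(re.escape(s) for s in TERM_STEMS), re.IGNORECASE)

def matches_screen(narrative: str) -> bool:
    """Return True if the event narrative contains any key-term variant."""
    return bool(PATTERN.search(narrative))

# Invented example narratives (not PA-PSRS data).
reports = [
    "Chest x-ray misread; nodule not appreciated on initial review.",
    "Patient fell while ambulating to bathroom.",
]
screened = [r for r in reports if matches_screen(r)]
```

A screen like this only nominates candidate reports; as described below, each report still required manual review to determine whether a diagnostic process failure actually occurred.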


Initially, we attempted to create an operational definition that would allow us to sort PA-PSRS reports as either diagnostic errors or not. We applied definitions outlined by NAM1 and others8,19,20 and ultimately determined that the type and amount of information needed to satisfy the criteria for any single definition could not be extracted from many of the PA-PSRS reports.

Because of challenges associated with classifying the PA-PSRS reports using the term diagnostic error, we explored approaches to classify them according to failures in the diagnostic process as defined in the diagnostic error evaluation and research (DEER) taxonomy14,20,21 and the NAM report (e.g., failure in information gathering, failure to establish an explanation [diagnosis]).1 The DEER taxonomy was the best fit based on the type and amount of information provided in the PA-PSRS reports.

We then modified the DEER taxonomy originally developed by Schiff and colleagues in 200514 and adapted by the Pennsylvania Patient Safety Authority ("the Authority") in 201021 to more completely capture the multidisciplinary nature of the diagnostic process and to allow for more precise classification of certain events. Modifications included the following:

  • Renaming process steps—the term physical examination became physical examination/assessment, assessment became hypothesis generation, and follow-up became monitoring/follow-up
  • Expanding the testing process step to include types of testing beyond laboratory and radiology—for instance, eye-pressure test, electrocardiogram
  • Adding failure points for the testing process related to specimen delivery problems and for the monitoring/follow-up process related to monitoring and communication
  • Relocating failure points related to recognizing urgency and complications from the former assessment process step to the monitoring/follow-up process step.

See Table 1 for the full modified DEER taxonomy.
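For classification work, the modified taxonomy can be held as a simple mapping from process step to failure points. The sketch below is partial and illustrative: the step names follow the modifications described above (with referral/consultation assumed from the original DEER taxonomy as the seventh step), and the failure points shown are only those mentioned in this article, not the full contents of Table 1.

```python
# Partial sketch of the modified DEER taxonomy as process step -> failure
# points. Failure points are illustrative examples from this article only;
# "referral/consultation" is assumed from the original DEER taxonomy.
MODIFIED_DEER = {
    "access/presentation": [],
    "history": [],
    "physical examination/assessment": [],  # renamed from physical examination
    "testing": [
        "misread/misinterpreted test",
        "specimen delivery problem",  # failure point added in the modification
    ],
    "hypothesis generation": [],  # renamed from assessment
    "referral/consultation": [],  # assumed seventh step, per original DEER
    "monitoring/follow-up": [     # renamed from follow-up
        "failure/delay in recognizing urgency or complications",  # relocated here
        "monitoring failure",       # added in the modification
        "communication failure",    # added in the modification
    ],
}

def classify(step: str, failure_point: str) -> tuple:
    """Validate the process step and return a (step, failure point) pair."""
    if step not in MODIFIED_DEER:
        raise ValueError("unknown process step: " + step)
    return (step, failure_point)
```

In the analysis described below, each event received exactly one such (process step, failure point) pair, reflecting its most critical failure.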

Table 1. Modified DEER Taxonomy  

Definitions and Inclusion Criteria

We each independently reviewed and analyzed the PA-PSRS report narratives, recommendations, and contributing factors to identify events related to diagnostic process failure. We then compared our findings and resolved discrepancies through joint analysis and consensus. Medical subject matter experts were consulted as necessary.

We created term definitions and inclusion criteria as follows:

  • Diagnostic process failure—a process step and failure point from the modified DEER taxonomy must be identified. A diagnostic process failure can occur without definitive information about the accuracy or timeliness of the diagnosis itself.
  • Unable to determine—the PA-PSRS report may relate to the diagnostic process but there is insufficient information to determine that a diagnostic process failure occurred or the process step during which it likely occurred.
  • Not a diagnostic process failure—the PA-PSRS report does not relate to the diagnostic process, no diagnostic process failure is identifiable, or it relates to a different event type altogether. The information provided in the PA-PSRS report does not meet the term definition/inclusion criteria for "diagnostic process failure" or "unable to determine."

Event Classification and Analysis

After identifying diagnostic process failure events, we reviewed narratives and other free-text fields to classify the events, determine the medical conditions involved, and identify events in which patients may have contributed to the process failure.

We classified each event in accordance with the modified DEER taxonomy based on the step in the diagnostic process during which it occurred ("process step") and specific failure that occurred during the process step ("failure point"). Although many events involved more than one process step or failure point, at times it was challenging to differentiate them from one another, and researchers have questioned whether there is value in doing so.20 For that reason, we identified only the most critical failure point for each event.

We also analyzed the events based on data in discrete fields such as harm score22 and care area. We combined like care areas into categories; for example, ambulatory surgery, hospital operating rooms, and procedural areas like interventional radiology and invasive cardiology were combined into a surgical/procedural care area category.
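The aggregation steps above reduce to grouping like care areas and counting events by discrete fields. The sketch below, with invented event records (not PA-PSRS data) and an assumed field layout, illustrates the care-area grouping and the percentage tallies reported in the Results:

```python
from collections import Counter

# Hypothetical event records for illustration; not actual PA-PSRS data.
events = [
    {"process_step": "testing", "care_area": "operating room", "harm_score": "E"},
    {"process_step": "testing", "care_area": "emergency department", "harm_score": "F"},
    {"process_step": "monitoring/follow-up", "care_area": "interventional radiology", "harm_score": "I"},
]

# Combine like care areas into categories, as described above.
CARE_AREA_MAP = {
    "ambulatory surgery": "surgical/procedural",
    "operating room": "surgical/procedural",
    "interventional radiology": "surgical/procedural",
    "invasive cardiology": "surgical/procedural",
}

for e in events:
    e["care_area"] = CARE_AREA_MAP.get(e["care_area"], e["care_area"])

def tally(events, field):
    """Count events by a discrete field; return {value: (count, percent)}."""
    counts = Counter(e[field] for e in events)
    total = len(events)
    return {k: (n, round(100 * n / total, 1)) for k, n in counts.items()}

by_step = tally(events, "process_step")
by_area = tally(events, "care_area")
```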

* PA-PSRS is a secure, web-based reporting system through which Pennsylvania hospitals, ambulatory surgical facilities, abortion facilities, and birthing centers submit reports in accordance with mandatory reporting laws outlined in the Medical Care Availability and Reduction of Error Act (Act 13 of 2002).18

A "Serious Event" is an event, occurrence, or situation involving the clinical care of a patient in a medical facility that results in death or compromises patient safety and results in an unanticipated injury requiring the delivery of additional health care services to the patient.18

Results

Of the 1,212 PA-PSRS reports analyzed, 138 (11.4%) events met the inclusion criteria and were defined as "diagnostic process failure" events ("events"). A determination could not be made for 20 (1.7%) of the reports, which were defined as "unable to determine." The remaining 1,054 (87.0%) were excluded and defined as "not a diagnostic process failure."

Process Step and Failure Point (Modified DEER Taxonomy)

More than two-thirds of the events (68.1%; n = 94 of 138) involved failures in the testing process. Of the 11 failure points under testing, misread and misinterpreted tests accounted for about one-third (33.0%; n = 31 of 94) of the events.

The monitoring/follow-up process step accounted for 13.0% (n = 18 of 138) of all events, with failures or delays in recognizing urgency or complications being identified most frequently (33.3%, n = 6 of 18).

Figure 1 shows the percentage of diagnostic process failure events by process step, along with the percentage of testing process events by failure point.

Figure 1. Percentage of Diagnostic Process Failure Events by Process Step (N = 138)  

Table 2 includes examples of event narratives from each of the seven process steps.

Table 2. Pennsylvania Patient Safety Reporting System Event Narrative Examples  


Harm Score

Harm scores were identified by healthcare facilities at the time of reporting. The majority of events involved temporary harm: harm scores E and F accounted for 73.2% (n = 101 of 138) of all events (Table 3). Events that contributed to or resulted in death accounted for 10.9% (n = 15).

Table 3. Diagnostic Process Failures in Serious Events by Harm Score (N = 138)

Although failures in the monitoring/follow-up process accounted for just 13.0% (n = 18 of 138) of all events, this step represented nearly half of all events that resulted in death (46.7%; n = 7 of 15).

Care Area

Care areas designated by reporting healthcare facilities were aggregated (Figure 2). The top two care areas identified in the events were surgical/procedural (21.0%, n = 29 of 138) and emergency department (ED; 16.7%, n = 23). Reports from discrete outpatient clinics and physician practices comprised 5.8% (n = 8) of all events; only practices and clinics under a hospital license are mandated to report into PA-PSRS. For context, of all PA-PSRS Serious Events reported during calendar year 2016, 2% (n = 152 of 7,548) were from outpatient clinics and physician practices.

Figure 2. Percentage of Diagnostic Process Failure Events by Care Area (N = 138)  

Failures in testing accounted for nearly three-quarters (72.4%, n = 21 of 29) of events reported from surgical/procedural areas, and more than half (52.2%, n = 12 of 23) reported from the ED.

Medical Condition

Figure 3 displays events by medical condition. In five instances, more than one condition was identified. Cancer was identified most frequently (22.5%, n = 31 of 138) and, of the cancers identified, lung cancer predominated (22.6%, n = 7 of 31).

Figure 3. Number of Diagnostic Process Failure Events by Medical Condition (N = 138)  

The conditions classified as "other" included hypo- and hyperglycemia, subdural hematoma, ectopic pregnancy, and esophageal diverticula. Vascular events included stroke, myocardial infarction, and pulmonary embolism. Infectious disease conditions included sepsis, appendicitis, and meningitis. Complications included those resulting from treatment or after a procedure, such as retained surgical items missed on imaging, pneumothorax, perforation, and retroperitoneal bleeding. Orthopedic conditions included fractures and dislocations.

Failures in testing accounted for 100% (n = 31 of 31) of the events related to cancer. Seven of the 11 failure points in the testing process step were involved (Figure 4). Misread or misinterpreted diagnostic tests accounted for 29.0% (n = 9) of the events.

Figure 4. Number of Cancer-Related Events by Testing Process Failure Point (N = 31)  

Patient Involvement

We identified nine instances (6.5%, n = 9 of 138) in which patients contributed to the event. For examples, refer to the event narratives marked with an asterisk in Table 2.

Discussion

Using our modified version of the DEER taxonomy, we classified and analyzed events reported through PA-PSRS to identify areas of opportunity for improvement and future research related to the diagnostic process. More than two-thirds of the events involved the testing process step, and surgical/procedural areas were frequently involved. Monitoring/follow-up failures were found to be an area of high risk, accounting for nearly half of all events resulting in death.

Testing Process

Our finding that failures in the testing process accounted for the largest proportion of events is consistent with other research in the field based on various methodologies. For example, a survey of physicians showed that 44% of the cases of diagnostic error involved a failure related to radiology/laboratory testing.20 In addition, in a review of 10,618 closed medical professional liability claims from 2013 through 2017, Hanscom, Small, and Lambrecht found that 52% of diagnostic error claims related to diagnostic/laboratory testing steps.6

It is unclear, based on our results, whether events in the category of testing contribute most often to diagnostic process failures or if these events are just more likely to be recognized and reported. The PA-PSRS taxonomy includes specific categories for reporting events related to testing, which may play a role. In addition, testing results, as well as the tasks associated therewith, tend to be more discernible in medical records and through various sources of data than are failures related to many of the other diagnostic process steps. Regardless, many of the failure points associated with testing can be considered "low-hanging fruit" and ripe for improvement efforts.

Surgical/Procedural Care Area

The surgical/procedural care area was the most frequent care area identified in this study. Although surgical/procedural areas do not typically rise to the top in studies of diagnostic error across provider types, such as those involving medical malpractice claims,5,6 care areas identified by reporting healthcare facilities do not necessarily reflect the healthcare provider or team "responsible."

For example, for specimen delivery problems originating from surgical/procedural areas, some healthcare facilities listed the location as operating room, while others selected the laboratory. Moreover, the surgical/procedural care area in our study includes events from hospitals and ambulatory surgical facilities, which likely contributed to this care area predominating.

Within the surgical/procedural care area, failures in the testing process step were most common. Because so many tests originate in surgical/procedural areas, testing processes and teamwork within these areas and across other departments and disciplines are vital to improving the diagnostic process and may be a great place to begin improvement work.

Monitoring/Follow-up

Although every step in the diagnostic process depends on contributions from the entire diagnostic team—including nurses, technologists, respiratory therapists, social workers, and others1,23-26—these contributions are most visible during the testing and monitoring/follow-up process steps. We have already identified opportunities for improvement work related to testing; given that more than one-third of the events classified as failures in the category monitoring/follow-up resulted in patient death, this area may warrant further attention as well.

We chose to use the term monitoring/follow-up for clarity, although the contributions made by healthcare team members during this step of the process go well beyond monitoring. Traditionally, reference to a diagnostic error meant there was a mistake in identifying the primary cause of a patient's signs and symptoms; in other words, an inaccurate or delayed diagnosis of a new condition. However, especially in acute care settings, complications can arise from treatment or after a procedure for the original problem, or a separate clinical issue can present during the course of care.14 The entire healthcare team must be vigilant about monitoring the patient, recognizing and communicating changes in a timely manner, and following up as appropriate.27

Patient Involvement

Patients also play a role as key members of the diagnostic team.1,23,25,28 Although the patient does not bear sole responsibility for a successful diagnostic process, during certain process steps, the patient's participation is vital. In our study, we identified nine events in which the patient contributed to the process failure, the majority of which involved the process steps of access/presentation or history.

Hypothesis Generation

The process step hypothesis generation accounted for 9% of the events in our study. Although not a direct comparison in terms of methodology or definitions, studies based on data from medical record reviews and provider interviews,19 malpractice claims,6 and voluntary reports from ED physicians29 appear to reflect a much higher proportion of occurrences involving this process. However, it is challenging to distinguish the cognitive aspects of hypothesis generation from not only the cognitive aspects involved in other steps of the process, but from all of the organizational, environmental, and other work-system factors that might have impacted cognition at the time.

Although these findings represent a lower-than-expected rate of reporting for events involving hypothesis generation, we cannot be sure whether a lack of reporting by physicians contributed, because PA-PSRS reports do not capture the role of the reporter. However, some experts in the field have recognized insufficient incident reporting by physicians7 and emphasize that physician reports of diagnostic errors can call attention to occurrences that may otherwise go unidentified.30

Improvement Opportunities

Clearly, there is no shortage of opportunities to improve the diagnostic process. We have identified some priority areas of focus based on data collected across Pennsylvania, and healthcare facilities can use the modified DEER taxonomy locally to identify trends, set priorities, and create improvement strategies. Sources of information that can serve as a starting point for identifying occurrences to classify include incident reports, employee and patient surveys, patient and family complaints and grievances, medical record reviews, morbidity and mortality and peer reviews, malpractice claims, insurance claims, and clinical surveillance. These sources are complementary and can be combined for the most complete understanding.1,10,31

In addition, Table 4 lists a number of resources that may help to address some of the process issues identified in this study.

Table 4. Resources to Improve Aspects of the Diagnostic Process  

Limitations

This analysis is based on facility-reported Serious Events and does not quantify diagnostic error in Pennsylvania. Despite mandatory reporting laws,18 the data are subject to the limitations of self-reporting and the complexities of the reporting system. The Authority's findings might have differed had the analysis included Incidents.*

There is no explicit taxonomy available in PA-PSRS for reporting diagnostic errors or diagnostic process failures. PA-PSRS reports were analyzed based on information in the free-text narratives and structured fields. Reports were neither discussed with associated caregivers nor correlated with medical records.
* An "Incident" is an event, occurrence, or situation involving the clinical care of a patient in a medical facility which could have injured the patient but did not either cause an unanticipated injury or require the delivery of additional health care services to the patient.18

Conclusion

Although the diagnostic process is extensive and complex, this study provides key insights and areas worth exploring further. While the Authority has access to a breadth of valuable data from PA-PSRS events reported by healthcare facilities across the Commonwealth, hospitals and health systems have much deeper and richer sources of data related to each unique event. Using the modified DEER taxonomy as a starting point, the Authority, along with healthcare facility leaders, healthcare providers, and all interested stakeholders, can work together to create an effective and reliable system for measuring events involving diagnostic process failures, thus strengthening our ability to recognize patterns and prioritize areas of opportunity for learning and improvement both at the facility level and broadly across Pennsylvania and beyond.

References

  1. National Academies of Sciences, Engineering, and Medicine. Improving diagnosis in health care. Washington (DC): The National Academies Press; 2015. Also available:  
  2. Singh H, Meyer AN, Thomas EJ. The frequency of diagnostic errors in outpatient care: estimations from three large observational studies involving US adult populations. BMJ Qual Saf. 2014 Sep;23(9):727-31. Also available: PMID: 24742777
  3. Misdiagnosed: docs' mistakes affect 12 million a year. [internet]. New York (NY): NBC News; 2014 Apr 16 [accessed 2018 Jun 28]. [5 p]. Available:  
  4. Saber Tehrani AS, Lee H, Mathews SC, Shore A, Makary MA, Pronovost PJ, Newman-Toker DE. 25-Year summary of US malpractice claims for diagnostic errors 1986-2010: an analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013 Aug;22(8):672-80. Also available: PMID: 23610443
  5. Troxel DB. The Doctor's Advocate. Diagnostic error in medical practice by specialty. [internet]. Napa (CA): The Doctors Company; 2014 [accessed 2018 Jun 05]. [7 p]. Available:  
  6. Hanscom R, Small M, Lambrecht A. Diagnostic accuracy: room for improvement. Boston (MA): Coverys; 23 p. Also available:  
  7. Graber ML. The incidence of diagnostic error in medicine. BMJ Qual Saf. 2013 Oct;22 Suppl 2:ii21-ii27. Also available: PMID: 23771902
  8. Singh H. Editorial: Helping health care organizations with defining diagnostic errors as missed opportunities in diagnosis. Jt Comm J Qual Patient Saf. 2014 Mar;40(3):99-101. Also available: PMID: 24730204
  9. Olson APJ, Graber ML, Singh H. Tracking progress in improving diagnosis: a framework for defining undesirable diagnostic events. J Gen Intern Med. 2018 Jan 29;33(7):1187-91. Also available: PMID: 29380218
  10. Zwaan L, Singh H. The challenges in defining and measuring diagnostic error. Diagnosis. 2015 Jun;2(2):97-103. Also available: PMID: 26955512
  11. Health Services Advisory Group, Inc. CMS Quality Measure Development Plan: supporting the transition to the Quality Payment Program. 2017 annual report. Baltimore (MD): Center for Clinical Standards and Quality, Centers for Medicare & Medicaid Services (CMS); 2017 Jun 2. 74 p. Also available:
  12. Improving diagnostic quality and safety. Washington (DC): National Quality Forum; 2017 Sep. 77 p. Also available:  
  13. Newman-Toker DE. A unified conceptual model for diagnostic errors: underdiagnosis, overdiagnosis, and misdiagnosis. Diagnosis. 2014 Jan;1(1):43-8. Also available: PMID: 28367397
  14. Schiff GD, Kim S, Abrams R, Cosby K, Lambert B, Elstein AS, Hasler S, Krosnjar N, Odwazny R, Wisniewski MF, McNutt RA. Diagnosing diagnosis errors: lessons from a multi-institutional collaborative project. In: Henriksen K, Battles JB, Marks ES, et al, editor(s). Advances in patient safety: from research to implementation. Vol. 2, concepts and methodology. Rockville (MD): Agency for Healthcare Research and Quality; 2005 Feb. p. 255-78. Also available:  
  15. Schiff GD. Minimizing diagnostic error: the importance of follow-up and feedback. Am J Med. 2008 May;121(5 Suppl):S38-42. Also available: PMID: 18440354
  16. Henriksen K, Brady J. The pursuit of better diagnostic performance: a human factors perspective. BMJ Qual Saf. 2013 Oct;22 Suppl 2:ii1-ii5. Also available: PMID: 23704082
  17. Singh H, Sittig DF. Advancing the science of measurement of diagnostic errors in healthcare: the Safer Dx framework. BMJ Qual Saf. 2015 Feb;24(2):103-10. Also available: PMID: 25589094
  18. Medical Care Availability and Reduction of Error (MCARE) Act of March 20, 2002, P.L. 154, No. 13, Cl. 40. Available:
  19. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005 Jul 11;165(13):1493-9. Also available: PMID: 16009864
  20. Schiff GD, Hasan O, Kim S, Abrams R, Cosby K, Lambert BL, Elstein AS, Hasler S, Kabongo ML, Krosnjar N, Odwazny R, Wisniewski MF, McNutt RA. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009 Nov 09;169(20):1881-7. Also available: PMID: 19901140
  21. Diagnostic error in acute care. Pa Patient Saf Advis. 2010 Sep;7(3):76-86. Also available:  
  22. Pennsylvania Patient Safety Authority harm score taxonomy. [internet]. Pennsylvania Patient Safety Authority; 2015 [accessed 2017 Jun 08]. Available:;12(1)/PublishingImages/taxonomy.pdf.  
  23. Schiff GD. Diagnosis and diagnostic errors: time for a new paradigm. BMJ Qual Saf. 2014 Jan;23(1):1-3. Also available: PMID: 24050984
  24. Thomas DB, Newman-Toker DE. Diagnosis is a team sport - partnering with allied health professionals to reduce diagnostic errors. Diagnosis (Berl). 2016 Jun 01;3(2):49-59. Also available: PMID: 29536891
  25. Graber ML, Rusz D, Jones ML, Farm-Franks D, Jones B, Cyr Gluck J, Thomas DB, Gleason KT, Welte K, Abfalter J, Dotseth M, Westerhaus K, Smathers J, Adams G, Laposata M, Eichbaum Q, Nabatchi T, Compton M. The new diagnostic team. Diagnosis. 2017;4(4):225-38. Also available:  
  26. Gleason KT, Davidson PM, Tanner EK, Baptiste D, Rushton C, Day J, Sawyer M, Baker D, Paine L, Dennison Himmelfarb CR, Newman-Toker DE. Defining the critical role of nurses in diagnostic error prevention: a conceptual framework and a call to action. Diagnosis. 2017;4(4):201-24.
  27. Considine J. Nurses, diagnosis and diagnostic error. Diagnosis. 2017;4(4):197-9. Also available: PMID: 29536936
  28. McDonald KM, Bryce CL, Graber ML. The patient is in: patient involvement strategies for diagnostic error mitigation. BMJ Qual Saf. 2013 Oct;22 Suppl 2:ii33-ii39. Also available: PMID: 23893394
  29. Okafor N, Payne VL, Chathampally Y, Miller S, Doshi P, Singh H. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine. Emerg Med J. 2016 Apr;33(4):245-52. Also available: PMID: 26531860
  30. Graber ML, Trowbridge R, Myers JS, Umscheid CA, Strull W, Kanter MH. The next organizational challenge: finding and addressing diagnostic error. Jt Comm J Qual Patient Saf. 2014 Mar;40(3):102-10. PMID: 24730205
  31. Levtzion-Korach O, Frankel A, Alcalai H, Keohane C, Orav J, Graydon-Baker E, Barnes J, Gordon K, Puopulo AL, Tomov EI, Sato L, Bates DW. Integrating incident data from five reporting systems to assess patient safety: making sense of the elephant. Jt Comm J Qual Patient Saf. 2010 Sep;36(9):402-10. PMID: 20873673
  32. Patient's Toolkit for Diagnosis. [internet]. Evanston (IL): Society to Improve Diagnosis in Medicine; [accessed 2018 Sep 18]. [3 p]. Available:  
  33. Lost surgical specimens, lost opportunities. PA PSRS Patient Saf Advis. 2005 Sep;2(3):1-5. Also available:  
  34. In vitro hemolysis: delays may pose safety issues. PA PSRS Patient Saf Advis. 2007 Jun;4(2):64-6. Also available:  
  35. Liberatore K. The link between health IT and laboratory test problems. Pa Patient Saf Advis. 2018 Oct;15(Suppl. 1) Also available:  
  36. Clinical reasoning toolkit. [internet]. Evanston (Il): Society to Improve Diagnosis in Medicine; [accessed 2018 Aug 09]. [4 p]. Available:  
  37. Pennsylvania Patient Safety Authority. Alarm interventions during medical telemetry monitoring: a failure mode & effects analysis. Harrisburg (PA): Pennsylvania Patient Safety Authority; 2008 Mar. 50 p. Also available:
  38. Connecting remote cardiac monitoring issues with care areas. Pa Patient Saf Advis. 2009 Sep;6(3):79-83. Also available:
  39. Lacker C. Physiologic alarm management. Pa Patient Saf Advis. 2011 Sep;8(3):105-8. Also available:  
  40. Managing patient access and flow in the emergency department to improve patient safety. Pa Patient Saf Advis. 2010 Dec;7(4):123-34. Also available:
  41. Magee MC. Patient flow in the ED phase II - diagnostic evaluation through disposition decision. Pa Patient Saf Advis. 2015 Mar;12(1):7-18. Also available:
  42. Magee MC. Patient flow in the emergency department: phase III - after disposition decision through departure. Pa Patient Saf Advis. 2015 Dec;12(4):132-40. Also available:
  43. Feil M. Warming blankets and patient harm. Pa Patient Saf Advis. 2017 Dec;14(4) Also available:
  44. Duncan KD, McMullan C, Mills BM. Early warning systems: the next level of rapid response. Nursing. 2012 Feb;42(2):38-44; quiz 45. Also available: PMID: 22227597
  45. Clarke SP, Aiken LH. Failure to rescue. Am J Nurs. 2003 Jan;103(1):42-7. PMID: 12544057
  46. Silber JH, Williams SV, Krakauer H, Schwartz JS. Hospital and patient characteristics associated with death after surgery. A study of adverse occurrence and failure to rescue. Med Care. 1992 Jul;30(7):615-29. PMID: 1614231

The Pennsylvania Patient Safety Advisory may be reprinted and distributed without restriction, provided it is printed or distributed in its entirety and without alteration. Individual articles may be reprinted in their entirety and without alteration, provided the source is clearly attributed.

Current and previous issues are available online at