Laboratory testing plays an important role in the diagnostic process; thus, timely and accurate laboratory testing processes are essential to delivering safe patient care. Health information technology (health IT) has been introduced as a mechanism to reduce error and improve efficiency throughout the laboratory testing process. Yet, health IT was indicated as a contributing factor in 775 evaluable laboratory-test-problem events reported from January 2016 through December 2017 through the Pennsylvania Patient Safety Reporting System. The pre-pre analytical phase, which encompasses test ordering, specimen collection, and transport to the laboratory, accounted for the largest number of events. Problems due to incorrect human data entry (i.e., wrong input) and incorrect machine output or display of data (i.e., output/display error) were most common. Risk reduction strategies include assembling a multidisciplinary team to evaluate and improve the total testing process, simplifying test names in order menus, monitoring the display of results, and establishing a communication plan for incomplete specimens, canceled specimens, and amended results.
Laboratory results are an integral part of decision making during the diagnostic process. Health information technology (health IT) has been introduced throughout the laboratory testing process to improve the accuracy and efficiency of tasks once reliant on paper, manual effort, phone calls, and couriers.
Health IT has transformed the laboratory test process, from ordering and collection of tests through the communication and interpretation of results. As the role of health IT in the laboratory testing process has advanced, so too has the possibility for patient safety to be compromised in new and unexpected ways.1
The introduction of health IT throughout the laboratory testing process has complemented an evolving appreciation of the complex, interrelated processes that extend beyond the walls of the laboratory. Regulatory guidelines, patient safety priorities, and a focus on patient-centered care have supported this broader, systems approach to evaluating and improving patient care.2-4
Recognizing the importance of studying the effects of health IT on patient safety and the diagnostic process, this analysis evaluates reports that associated health IT with laboratory test problems submitted by Pennsylvania healthcare facilities over a two-year period.
Pennsylvania Patient Safety Authority analysts queried the Pennsylvania Patient Safety Reporting System (PA-PSRS) for events submitted from January 1, 2016, through December 31, 2017, that were submitted under the event subtype "laboratory test problem" and indicated health IT was a contributing factor.*
Analysts manually reviewed all reports and categorized the phase of laboratory testing and health IT problems. The phase of laboratory testing was categorized using the five phases of laboratory testing described by Plebani.2 The steps within each phase are outlined in Table 1.
The health IT problems described in event narratives were categorized using a health IT–specific taxonomy developed by Magrabi and co-authors.5 This taxonomy classifies problems as being human- or machine-related, and occurring at the point of data input, transfer, output, or at a broader general technical level (Figure 1). When specifically reported in the event narrative, contributing factors from the Magrabi taxonomy were also applied. A single event could be tagged with multiple categories from the Magrabi taxonomy. Reports were excluded from the analysis based on the following criteria: description of a non-laboratory diagnostic test problem, insufficient detail to categorize the phase of laboratory testing, or insufficient detail to categorize the health IT problem.
After categorizing all evaluable reports using adaptations of both the Plebani and Magrabi taxonomies, the events were analyzed by reported harm score, phase of the laboratory testing process, and health IT problem. Analysts conducted a review of the literature to ascertain background on laboratory testing errors, the role of health IT in laboratory testing, and strategies to reduce risk.
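The dual-taxonomy approach above amounts to tagging each event with one Plebani phase and one or more Magrabi categories, then tallying. The following minimal sketch illustrates that counting logic only; the event records, phase labels, and category names are simplified placeholders, not the actual PA-PSRS data fields or the full taxonomies:

```python
from collections import Counter

# Hypothetical, simplified event records: each report carries one
# Plebani phase and one or more Magrabi health IT problem categories.
events = [
    {"phase": "pre-pre analytical", "it_problems": ["wrong input"]},
    {"phase": "pre-pre analytical", "it_problems": ["output/display error"]},
    {"phase": "post-analytical", "it_problems": ["wrong input", "software issue"]},
    {"phase": "analytical", "it_problems": ["system interface"]},
]

# Tally events by testing phase (one phase per event).
phase_counts = Counter(e["phase"] for e in events)

# Tally health IT problem categories. Because a single event can be
# tagged with multiple categories, the category total can exceed the
# event total (as with 917 categories across 775 events).
category_counts = Counter(p for e in events for p in e["it_problems"])
```

This structure is why the analysis reports both event counts by phase and category counts by health IT problem, and why the two denominators differ.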
* Note: The question, "Did health IT cause or contribute to this event?" was added to the PA-PSRS database in April 2015.6
The query identified 864 reports, of which 6 described non-laboratory test problems, and 83 had insufficient detail to categorize the phase of testing or health IT problem. The remaining 775 evaluable events are described below.
More than 99% of events were reported as Incidents (Figure 2). The most common harm score in all phases of the laboratory testing process was C (i.e., event reached the individual but did not cause harm and did not require increased monitoring).7 Three events resulted in patient harm (i.e., Serious Events): one pre-analytical event involved a test ordered but not performed, one analytical event involved wrong results, and one post-analytical event involved wrong results. The pre-analytical event, in which an ordered test was not performed, was associated with the patient's death (i.e., harm score I).
Laboratory Testing Phase
Based on the Plebani taxonomy addressing the phases of laboratory testing,2 the pre-pre analytical phase was associated with the largest number of events involving laboratory test problems. The analytical phase was associated with the fewest (Figure 3).
Health IT Problems
Based on the Magrabi taxonomy addressing health IT defects,5 review of the 775 PA-PSRS events identified 917 problems (Table 2). Human-computer interactions were involved in more than half (54.7%) of categories assigned; the remaining 45.3% were machine-related. Problems involving information input were most common (46.5%; n = 426 of 917), with 72.3% occurring during the pre-pre analytical phase (n = 308 of 426). Problems involving information output were the next most common (34.8%; n = 319 of 917).
At a more granular level, the most common event report categories assigned were problems due to wrong input, output/display error, system interface issues, and failure to update information (i.e., didn't do) (Table 2).
Problems related to wrong input were identified in the largest number of event reports (32.0%; n = 293 of 917), with 73.4% occurring in the pre-pre analytical phase (n = 215 of 293). Wrong input problems in the post-analytical phase were the next most common (18.8%; n = 55 of 293). Wrong inputs during test ordering and resulting were both mechanical (e.g., typing errors, wrong drop-down menu selection) and cognitive in nature (e.g., acting upon incorrect information, misinterpreting information).
Examples of event reports categorized as wrong inputs in the pre-pre analytical phase are as follows:*
Incorrect provider was entered into the electronic orders system as ordering provider. Surgical report was sent to wrong provider.
Blood culture submitted. Source listed on requisition was "Arterial Catheter, Blood." Per collector, source was actually "Umbilical Artery." Sample was reordered with corrected source and processed.
An urgent urinalysis was ordered by the physician. The order entry indicated the specimen was already collected. The specimen had not already been collected and should have been entered as pending to activate a collection workflow. Followed up with physician regarding correct order entry.
Specimen collection status was set to "unit collect." Patient did not have a central device that would permit a unit collection. Nurse changed the collection status to "lab collect."
Examples of event reports categorized as wrong inputs in the post-analytical phase are as follows:
When entering manual differential results into the LIS [laboratory information system], tech accidentally inverted two results.
Test was negative but tech entered result as positive.
Data output/display errors were identified in the second largest number of event reports (19.7%; n = 181 of 917). More than half of these errors occurred in the pre-pre analytical phase (55.2%; n = 100 of 181) and a third occurred in the post-analytical phase (32.6%; n = 59 of 181). The types of technology involved in output/display errors included electronic health records (EHRs), laboratory information systems (LISs), printers, and mobile applications.
Examples of event reports categorized as data output/display errors in the pre-pre analytical phase are as follows:
Specimen sent to lab with incomplete patient information due to printer malfunction - cut off patient data.
STAT CBC [complete blood count] ordered by physician in computer. When nurse went in to review orders, the order was erroneously displayed as Routine. Help desk ticket entered to correct computer programming.
In the post-analytical phase, rule-based computer interventions such as autoverification and result flags were linked to output/display errors. Examples of such event reports are as follows:
Patient's blood cultures were positive. Result was not flagged in red font color in EHR.
BHCG [beta-human chorionic gonadotropin] result autoverified prior to autodilution.
System Interface Issues
System interface issues were identified in the third largest number of event reports (7.5%; n = 69 of 917), with all but one occurring in the pre-pre analytical, pre-analytical, and post-analytical phases. Described in narratives as information that did not "cross over" or could not be "seen," such problems required additional communication between clinicians and the laboratory to resolve.
An example of an event report categorized as a system interface issue in the pre-pre analytical phase is as follows:
Order for potassium lab entered. Nurse called phlebotomy who stated that the order was not transferred to them.
An example of an event report categorized as a system interface issue in the pre-analytical phase is as follows:
Urinalysis ordered, collected, and sent to Laboratory. Per the Laboratory, the order could not be seen and thus the sample was not processed.
An example of an event report tagged as a system interface issue in the post-analytical phase is as follows:
Result did not transfer from lab system to clinical system in physician office.
Failure to update information (i.e., didn't do) was the fourth most common category applied to event reports (6.9%; n = 63 of 917), with 69.8% occurring in the pre-pre analytical phase (n = 44 of 63). Examples of didn't do problems included orders not being released (e.g., a standing order that was not clicked off for completion), tests not being marked as collected, and orders not being discontinued. Five of these events involved default values in laboratory test orders not being updated to reflect the intended test, collection date, or collection frequency.
Although not among the most common categories applied to event reports, the latent patient safety risks associated with new software functionality and system configurations make examination of the general technical category important.5 Collectively, general technical problems accounted for 8.6% of all categories applied to reports (n = 79 of 917).
Problems involving software issues, particularly system configuration and functionality, were the most common, followed by computer systems being down or too slow. Examples of system configuration problems include duplicate order rules misfiring, incorrect test order menus, and incorrect reference ranges. Examples of functionality problems include the laboratory being unable to see all fields of laboratory test orders, nonfunctional forms, and incomplete labs ordered in the emergency department being unable to cross over to the inpatient record. Eighteen events described problems during computer system downtime such as specimen labeling errors and labs not being drawn or processed.
* The details of the PA-PSRS event narratives in this article have been modified to preserve confidentiality.
The majority of event reports that associated health IT with laboratory test problems involved processes occurring before the specimen reached the laboratory. Problems due to wrong inputs and output/display errors were most common. While fewer than 1% of events resulted in patient harm, every event in this analysis had the potential to affect patients and staff in the form of inconvenience, rework, and delay or inaccuracy in the diagnostic process.
The Total Testing Process
Laboratory leaders have historically been on the forefront of quality control and quality assurance efforts, typically focusing on processes within the laboratory to improve efficiency, accuracy, and turnaround.2,3 An expanded definition of laboratory errors that encompasses the continuum of processes, beginning with order placement and extending through communication of results, has been supported by agencies such as The Joint Commission, College of American Pathologists Laboratory Accreditation Program, and the International Organization for Standardization.2-4,8,9
Regulatory requirements have made the monitoring of pre-pre analytical and post-post analytical quality indicators a priority for laboratory leadership.3 This more comprehensive, patient-centered approach introduces new opportunities for improvement, along with governance, measurement, and perspective challenges that only a multidisciplinary team approach can overcome.2,3,10
The distribution of laboratory testing errors has been studied extensively. Variation in the definitions of testing phases makes comparison difficult.3 This analysis applied the phases of testing defined by Plebani to capture the transfer of information between locations, people, and IT systems. Although the laboratory test problems examined here were filtered to only events involving health IT, the results of this analysis align with Plebani's finding that pre-pre analytical errors are most frequent.11
Computerized provider order entry appears to be a particularly vulnerable step in the pre-pre analytical phase. Reasons may include the variety of people completing the task, the frequency of the task, staff training and competency, or reliance on human recall of patient-specific nuances. Orders entered as "nurse collect" instead of "phlebotomy collect," and specimen "collected" instead of "pending collection," were reported. The downstream effects of such errors included electronic collection workflows not being generated, causing specimen collection to be delayed or completely omitted.
The growing complexity of laboratory test menus may be another factor complicating order entry. Test menus on order screens may contain multiple abbreviations, similar-sounding options, inappropriate names, or new, less familiar additions.12-14 Whether built by non-clinicians, natural products of evolution, or with the intent of being accommodating and comprehensive, complex test menus set the stage for confusion, error, and inefficiency.12,13,15 Multidisciplinary committees charged with reviewing, refining, and approving additions to laboratory test menus can facilitate improvements in user experience as well as use and cost.12,16
Specimen Readiness for Processing
Delays in the pre-analytical phase were frequently caused by specimens arriving to the laboratory with incomplete information. Event narratives highlight diverse sources of such error, including interface issues, incorrect orders, unreleased orders, missing requisitions, and label problems. The quality of communication about specimen problems between clinicians and the laboratory was variable—from proactive and timely to reactive and delayed. On the collection end, processes such as specimen time-outs integrated into surgical or clinical checklists may help to ensure information completeness and accuracy before specimen delivery.17 Within the laboratory, procedures for how long to hold specimens with incomplete information can reduce the risk of specimen loss.17 An accompanying communication plan to ensure clinicians are notified of specimen problems in a timely manner may decrease downstream delays in care.
The status of a specimen in the EHR is intended to be a mechanism of tracking and communication. Clear, well-defined specimen statuses can prevent delayed recognition of problems, such as waiting hours for the results of a "pending" specimen only to learn it had not been collected. In an article on the benefits and challenges of an interfaced EHR and LIS, Petrides and co-authors recommend that facilities evaluate specimen statuses in order to proactively manage the risk of duplicate orders spurred by confusion.18 Schreiber and co-authors highlight two instances in which duplicate orders were canceled by the LIS but appeared as pending in the EHR.19 Solutions that leverage health IT, such as provider cancellation notices sent through an LIS-EHR interface,19 may reduce the risk of critical tests or irretrievable specimens being lost.
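The mismatches described above, in which an order is canceled in the LIS but still displayed as pending in the EHR, are at root a reconciliation problem between two systems. The sketch below illustrates such a cross-system status check; the order IDs, status values, and flat dictionaries are hypothetical stand-ins, not an actual LIS or EHR schema:

```python
# Hypothetical order-status snapshots from each system, keyed by order ID.
lis_status = {"A100": "canceled", "A101": "resulted", "A102": "in process"}
ehr_status = {"A100": "pending", "A101": "resulted", "A102": "in process"}

def find_status_mismatches(lis, ehr):
    """Flag orders whose status differs between the LIS and the EHR,
    e.g., canceled in the LIS but still shown as pending upstream."""
    return {
        order_id: (lis[order_id], ehr.get(order_id, "missing"))
        for order_id in lis
        if lis[order_id] != ehr.get(order_id, "missing")
    }

mismatches = find_status_mismatches(lis_status, ehr_status)
# Order A100 surfaces here: canceled in the LIS, pending in the EHR,
# prompting a cancellation notice to the ordering provider.
```

A periodic check of this kind is one way a bidirectional interface could surface discrepancies before a clinician spends hours waiting on a canceled order.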
Input problems in the post-analytical phase were far less common than in the pre-pre analytical phase, yet their closer proximity to the end point of the laboratory testing process—clinical decision making, based on results—makes these errors more concerning.11 Event narratives highlighted clerical errors, such as entering positive instead of negative values and time as 12:01 instead of 00:01, which were attributed to drop-down menu misclicks, number inversion, and keying errors. Many facilities have replaced manual entry of results with some degree of autoverification to reduce such errors and free staff to perform more specialized tasks.9,20 It is important to recognize that autoverification is not a flawless solution; quality control remains essential.20 Regardless of the cause, when results must be amended, a process for "immediate and proactive"21 communication with the clinician may mitigate the risk of patient harm.
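Autoverification of the sort described above is, at its core, rule-based screening: results inside defined limits release automatically, while everything else is held for technologist review. The following minimal sketch uses assumed, illustrative limits; real autoverification rules are analyte- and laboratory-specific and also incorporate delta checks, instrument flags, and quality control status:

```python
# Illustrative limits only; actual autoverification rules are defined
# and validated by each laboratory and include additional checks.
AUTOVERIFY_LIMITS = {
    "potassium": (2.5, 6.0),   # mmol/L
    "glucose": (40.0, 400.0),  # mg/dL
}

def autoverify(analyte, value):
    """Release a result automatically only if it falls within the
    configured limits; otherwise hold it for manual review."""
    limits = AUTOVERIFY_LIMITS.get(analyte)
    if limits is None:
        return "hold for review"  # no rule defined for this analyte
    low, high = limits
    return "auto-release" if low <= value <= high else "hold for review"

decision = autoverify("potassium", 7.2)  # out-of-range result is held
```

As the events in this analysis show, the rules themselves become a new failure point, which is why ongoing quality control of autoverification logic remains essential.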
Intuitive, accurate display of laboratory test results sets the stage for interpretation and action. The variety of ways results are now received by providers and patients—computers, devices, and paper printouts—makes ongoing quality control essential.22 The output/display errors in the post-analytical phase of this analysis represent such challenges. These events highlighted unexpected errors stemming from automatic computer-driven processes, such as autoverification, result flags, reflex orders, and quality control hard stops. Routine monitoring of result outputs, as well as targeted assessments with system upgrades, provides an opportunity to evaluate system performance and connect with end users to identify opportunities for improvement.22,23
Despite mandatory reporting laws, PA-PSRS data is subject to the limitations of self-reporting, including the complexities of selecting the appropriate event subtype and harm score. The indication of health IT as a contributing factor in the event is subject to the reporter's interpretation and understanding of health IT. Analysts' ability to categorize events is limited by the information provided in the event report, and reflects the analysts' interpretation of laboratory testing, health IT, and the Plebani and Magrabi taxonomies. The Plebani taxonomy, used in this analysis to differentiate the phases of laboratory testing, does not explicitly outline the role of the patient in the diagnostic process; this may represent a potential future enhancement.
Risk Reduction Strategies
The following risk reduction strategies come from event reports and from the literature:
- Engage a multidisciplinary team to evaluate and improve the laboratory testing process as it relates to health IT.4,10
- Periodically reassess system configuration decisions to optimize processes based on experience using the system.
- Monitor trends in test order errors (e.g., wrong collector, wrong source) to identify opportunities for improvement, such as provider training or order screen configuration.14
- Evaluate and refine the menu of laboratory tests in electronic orders to reduce erroneous and inappropriate selections.12-14
- Customize lab labels to include draw instructions, such as the number and color of tubes.14,24
- Perform a time-out before sending specimens, to ensure all orders, paperwork, specimens, and labels are complete and accounted for.17
- Leverage the power of a bidirectional EHR-LIS interface to allow the laboratory to monitor pending orders and help proactively manage missed or lost specimens.9
- Establish a procedure for handling specimens that arrive to the laboratory with incomplete information, including how long to hold the sample, and a mechanism for communicating the problem with the care team.17
- Evaluate the different specimen statuses that display in the EHR (e.g., pending, collected) to identify opportunities to enhance their clarity and utility.14,18,19
- Create a closed-loop process to communicate specimen cancellations. Consider ways to leverage IT, such as developing the ability to send cancellation notices and acknowledgments across a LIS-EHR interface.13,19
- Use autoverification to automatically send results to the LIS and EHR in order to decrease reliance on manually transcribing results.9,20,22
- Establish a procedure to ensure that amended results are communicated in a timely and proactive manner21 and that the corrected information flows into all appropriate result displays.
- Set up automatic alerts to notify the laboratory when ordering providers do not acknowledge receipt of test results.25
- Conduct routine monitoring of the result displays—in all the different mediums being used by providers and patients—to evaluate information accuracy and utility.22,23 Engage clinicians and patients in the process to gain their perspective and ideas for improvement.
- Include focused assessments of laboratory-result displays as part of system upgrade procedures.22
Well-designed and correctly used health IT can add efficiency and accuracy to the laboratory testing process. This analysis demonstrates how flaws in human-computer and machine performance—in all phases of the testing process—can impact the diagnostic process and patient care. The most common vulnerabilities highlighted in this analysis include steps in the pre-pre analytical phase and those throughout the testing process where there is potential for wrong inputs or output/display errors. Facilities seeking to leverage the power of health IT to improve the laboratory testing process may benefit from focusing efforts in these areas of opportunity.
Erin Sparnon, MEng, Engineering Manager, Health Devices, ECRI Institute, was consulted for her expertise and knowledge of health information technology during the development of this article.
- National Academies of Sciences, Engineering, and Medicine. Improving diagnosis in health care. Washington (DC): The National Academies Press; 2015.
- Plebani M. Errors in clinical laboratories or errors in laboratory medicine? Clin Chem Lab Med. 2006;44(6):750-9. Also available: http://dx.doi.org/10.1515/CCLM.2006.123. PMID: 16729864
- Hawkins R. Managing the pre- and post-analytical phases of the total testing process. Ann Lab Med. 2012 Jan;32(1):5-16. Also available: http://dx.doi.org/10.3343/alm.2012.32.1.5. PMID: 22259773
- Hammerling JA. A review of medical errors in laboratory diagnostics and where we are today. Lab Med. 2012 Feb-Mar;43(2):41-4. Also available: DOI: 10.1309/LM6ER9WJR1IHQAUY.
- Magrabi F, Ong MS, Runciman W, Coiera E. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc. 2012 Jan-Feb;19(1):45-53. PMID: 21903979
- Pennsylvania Patient Safety Authority annual report - 2015. Harrisburg (PA): Pennsylvania Patient Safety Authority; 2016 Apr 29. 98 p. Also available: http://patientsafety.pa.gov/PatientSafetyAuthority/Documents/annual_report_2015.pdf.
- Pennsylvania Patient Safety Authority harm score taxonomy. Harrisburg (PA): Pennsylvania Patient Safety Authority; 2015. 1 p. Also available: http://patientsafety.pa.gov/ADVISORIES/Documents/Tool%20PDFs/201503_taxonomy.pdf.
- Medical laboratories - Reduction of error through risk management and continual improvement. ISO/TS 22367:2008(en). In: Online Browsing Platform (OBP) [internet]. Geneva (Switzerland): International Organization for Standardization (ISO); 2008 [accessed 2018 Mar 14]. Available: https://www.iso.org/obp/ui/#iso:std:iso:ts:22367:ed-1:v1:en.
- Tieman BF. The role of lab automation in reducing diagnostic errors. MLO Med Lab Obs. 2017 Oct:28,30.
- Plebani M. Laboratory errors: How to improve pre- and post-analytical phases? Biochem Med. 2007;17(1):5-9. Also available: http://dx.doi.org/10.11613/BM.2007.001.
- Plebani M. The detection and prevention of errors in laboratory medicine. Ann Clin Biochem. 2010 Mar;47(2):101-10. Also available: http://dx.doi.org/10.1258/acb.2009.009222. PMID: 19952034
- Passiment E, Meisel JL, Fontanesi J, Fritsma G, Aleryani S, Marques M. Decoding laboratory test names: A major challenge to appropriate patient care. J Gen Intern Med. 2013 Mar;28(3):453-8. Also available: http://dx.doi.org/10.1007/s11606-012-2253-8. PMID: 23192446
- Wilkerson ML, Henricks WH, Castellani WJ, Whitsitt MS, Sinard JH. Management of laboratory data and information exchange in the electronic health record. Arch Pathol Lab Med. 2015 Mar;139(3):319-27. Also available: https://dx.doi.org/10.5858/arpa.2013-0712-SO.
- Petrides AK, Tanasijevic MJ, Goonan EM, Landman AB, Kantartjis M, Bates DW, Melanson SEF. Top ten challenges when interfacing a laboratory information system to an electronic health record: Experience at a large academic medical center. Int J Med Inform. 2017 Oct;106:9-16. Also available: http://dx.doi.org/10.1016/j.ijmedinf.2017.06.008. PMID: 28870384
- Laposata M, Dighe A. "Pre-pre" and "post-post" analytical error: high-incidence patient safety hazards involving the clinical laboratory. Clin Chem Lab Med. 2007;45(6):712-9. Also available: http://dx.doi.org/10.1515/CCLM.2007.173. PMID: 17579522
- Zutter M, Field J, Bernard G. Improving care and cutting costs: implementation of a laboratory formulary to facilitate better laboratory ordering practices. In: NEJM Catalyst [internet]. Waltham (MA): NEJM Group; 2017 Sep 20 [accessed 2018 Mar 15]. [9 p]. Available: https://catalyst.nejm.org/laboratory-formulary-facilitate-ordering-practices/.
- Heher YK. Specimen almost lost. In: PSNet [internet]. Rockville (MD): Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services; 2017 Nov [accessed 2018 Mar 14]. [9 p]. Available: https://psnet.ahrq.gov/webmm/case/427/specimen-almost-lost.
- Petrides AK, Bixho I, Goonan EM, Bates DW, Shaykevich S, Lipsitz SR, Landman AB, Tanasijevic MJ, Melanson SE. The benefits and challenges of an interfaced electronic health record and laboratory information system: effects on laboratory processes. Arch Pathol Lab Med. 2017 Mar;141(3):410-7. Also available: http://dx.doi.org/10.5858/arpa.2016-0146-OA. PMID: 28234574
- Schreiber R, Sittig DF, Ash J, Wright A. Orders on file but no labs drawn: investigation of machine and human errors caused by an interface idiosyncrasy. J Am Med Inform Assoc. 2017 Sep 1;24(5):958-63. Also available: http://dx.doi.org/10.1093/jamia/ocw188. PMID: 28339629
- Krasowski MD, Davis SR, Drees D, Morris C, Kulhavy J, Crone C, Bebber T, Clark I, Nelson DL, Teul S, Voss D, Aman D, Fahnle J, Blau JL. Autoverification in a core clinical chemistry laboratory at an academic medical center. J Pathol Inform. 2014;5(1):13. Also available: http://dx.doi.org/10.4103/2153-3539.129450. PMID: 24843824
- Janakiraman Mohta V. Amended lab results: communication slip. In: PSNet [internet]. Rockville (MD): Agency for Healthcare Research and Quality (AHRQ); 2012 Feb [accessed 2017 Mar 19]. Available: https://psnet.ahrq.gov/webmm/case/262/amended-lab-results-communication-slip.
- Beckwith BA, Aller RD, Brassel JH, Brodsky VB, de Baca ME. White paper: laboratory interoperability best practices - ten mistakes to avoid. Northfield (IL): College of American Pathologists; 2013 Mar. 32 p.
- Frantz C. Using unofficial lab reports. [internet]. Washington (DC): American Association for Clinical Chemistry (AACC); 2013 Apr 1 [accessed 2018 Mar 19]. Available: https://www.aacc.org/Publications/CLN/Articles/2013/april/PSF-Unofficial.aspx.
- Le RD, Melanson SE, Petrides AK, Goonan EM, Bixho I, Landman AB, Brogan AM, Bates DW, Tanasijevic MJ. Significant reduction in preanalytical errors for nonphlebotomy blood draws after implementation of a novel integrated specimen collection module. Am J Clin Pathol. 2016 Oct;146(4):456-61. Also available: http://dx.doi.org/10.1093/ajcp/aqw139. PMID: 27686172
- ECRI Institute PSO. ECRI Institute PSO Deep Dive: laboratory testing. Plymouth Meeting (PA): ECRI Institute; 2014 Mar. 58 p.
- Office of the National Coordinator for Health Information Technology. General instructions for the SAFER self assessment guides. Washington (DC): U.S. Department of Health & Human Services, Office of the National Coordinator for Health Information Technology; 2016 Nov. 36 p. (Safety Assurance Factors for EHR Resilience (SAFER)).
The Office of the National Coordinator for Health Information Technology (ONC) Safety Assurance Factors for EHR Resilience (SAFER) Self-Assessment Guide, Test Results Reporting and Follow-Up, is a tool for healthcare organizations to evaluate their implementation of strategies to improve the safety and safe use of EHR technology.26