Newsletters
October 2025

What You Need to Know

Impact of Artificial Intelligence on Patient Safety Events: Preliminary Exploration of Events Reported to the PA-PSRS Database

Artificial intelligence (AI) is reshaping healthcare and is expected to have a dramatic impact on patient safety as the technology is further developed and refined.1-7 AI is expected to improve patient safety in key areas, such as increasing the efficiency of clinical decisions and care,2,4,6 reducing human error,2,3,6 offering risk prediction and early detection of changes in patients’ condition,2-4 supporting system-level safety,2,4 and offering insights by aggregating many data sources.2,4 Despite the optimism surrounding AI, several patient safety concerns have been raised: AI models trained on biased or incomplete data,1,2,4,6-8 staff becoming overly reliant on and biased toward the recommendations provided by AI,1,6,8 staff not trusting the “black box,”2,6,7 erroneous recommendations for individual patients with incomplete and/or inaccurate records,6,8 staff overwhelmed by too much information and too many notifications,2 and AI-based tools implemented before sufficient testing and validation.4,6-8

PA-PSRS Reports of Events Involving AI

Given the concerns about AI and patient safety, we conducted a preliminary exploration of the Pennsylvania Patient Safety Reporting System (PA-PSRS) for events that involve AI, either causing the event or preventing/detecting the event. Based on a limited sample of event reports, we found that AI was primarily involved in monitoring of patients for exiting their bed, interpretation of data collected from patient monitoring devices, reading of images (e.g., X-ray, CT), and note dictation. We found that events were occurring at both small and large hospitals, and across a range of care areas (e.g., cardiovascular unit, emergency department, general medicine, imaging, orthopedic, surgical unit).

In most of the event reports within our sample, AI was used to prevent or detect issues. For example, AI was involved in numerous events in which a human failed to detect a significant image finding (i.e., a false negative) but AI detected the misread. In other instances, AI analyzed patient behavior to predict or detect a patient exiting their bed, because staff were concerned the patient was at risk of a fall or disorientation (this technology is more advanced than a traditional bed alarm). In these bed-related events, the reports frequently described scenarios in which the AI technology alerted staff to a prediction that the patient was about to exit their bed, but staff were still unable to reach the patient before their fall. Among the instances where AI was used to prevent or detect issues, a majority of the event reports described AI as being proactively implemented; in some events, however, it was implemented reactively, with the goal of preventing the same issue from recurring.

We also identified limited instances where AI caused or contributed to events, either by misreading an image or patient monitoring data, or when a new AI program began producing a much greater quantity of information, overwhelming staff with notifications and reportedly delaying their identification of an urgent finding.

Future Directions and Conclusion

Numerous sources have warned the healthcare community that, despite the intended benefits of AI, it could create a broad range of risks to patient safety.1,2,6,7,10 Despite the potential risks, we have identified very few PA-PSRS reports that describe AI as a cause of or contributing factor to patient safety events. However, AI-related events may be underdetected, and their true frequency could be higher than what is reflected in our current data. For example, AI involvement may not be recognized during monitoring and analysis of PA-PSRS reports if reporters do not mention “artificial intelligence,” “AI,” or the name of an AI-enabled software/device.11 Similarly, reporters may not always recognize when AI is involved, such as when staff use an AI-enabled software/device but are unaware that AI was being used or that the design of the AI somehow contributed to the event.10 A lack of understanding of how AI contributed to an event will likely result in important information being absent from the patient safety event report. Finally, events involving AI might also be underreported in more nuanced circumstances, such as when an AI-enabled technology (e.g., clinical decision support [CDS]) provides a nonoptimal or erroneous recommendation that the clinician follows, with the error not identified until much later.

Leaders and staff at healthcare facilities need to be vigilant in detecting and reporting patient safety events involving AI.9,10 The reporting of events will enable the Patient Safety Authority to identify statewide patterns and share lessons learned across Pennsylvania.

Call to Action: Report AI-Involved Events to PA-PSRS

When submitting a report to PA-PSRS, include “artificial intelligence” or “AI” in the event narrative, along with the name of the software and/or device, and describe the full context of the event. Ensure that the report identifies the contributing factor(s) and outcome. This information is vital for the Patient Safety Authority to monitor how AI is impacting patient safety and to deliver actionable insights back to facilities.

References

  1. Ratwani RM, Bates DW, Classen DC. Patient Safety and Artificial Intelligence in Clinical Care. JAMA Health Forum. 2024;5(2):e235514. doi:10.1001/jamahealthforum.2023.5514
  2. Tighe P, Mossburg S, Gale B. Artificial Intelligence and Patient Safety: Promise and Challenges. Agency for Healthcare Research and Quality, US Department of Health and Human Services. https://psnet.ahrq.gov/perspective/artificial-intelligence-and-patient-safety-promise-and-challenges. Published March 27, 2024. Accessed September 11, 2025.
  3. Bates DW, Levine D, Syrowatka A, et al. The Potential of Artificial Intelligence to Improve Patient Safety: A Scoping Review. NPJ Digit Med. 2021;4(1):54. doi:10.1038/s41746-021-00423-6
  4. Classen D, Longhurst C, Thomas E. Bending the Patient Safety Curve: How Much Can AI Help? NPJ Digit Med. 2023;6(2):1-3. doi:10.1038/s41746-022-00731-5
  5. Gottlieb S, Silvis L. Regulators Face Novel Challenges as Artificial Intelligence Tools Enter Medical Practice. JAMA Health Forum. 2023;4(6):e232300. doi:10.1001/jamahealthforum.2023.2300
  6. Ross P, Spates K. Considering the Safety and Quality of Artificial Intelligence in Health Care. Jt Comm J Qual Patient Saf. 2020;46(10):596-9. doi:10.1016/j.jcjq.2020.08.002
  7. Garcia-Gomez JM, Blanes-Selva V, Romero CA, et al. Mitigating Patient Harm Risks: A Proposal of Requirements for AI in Healthcare. Artif Intell Med. 2025;167:103168. doi:10.1016/j.artmed.2025.103168
  8. Challen R, Denny J, Pitt M, et al. Artificial Intelligence, Bias and Clinical Safety. BMJ Qual Saf. 2019;28(3):231-7. doi:10.1136/bmjqs-2018-008370
  9. Hose B-Z, Handley JL, Biro J, et al. Development of a Preliminary Patient Safety Classification System for Generative AI. BMJ Qual Saf. 2025;34(2):130-2. doi:10.1136/bmjqs-2024-017918
  10. Handley JL, Krevat SA, Fong A, Ratwani RM. Artificial Intelligence Related Safety Issues Associated With FDA Medical Device Reports. NPJ Digit Med. 2024;7:351. doi:10.1038/s41746-024-01357-5
  11. U.S. Food and Drug Administration. Artificial Intelligence-Enabled Medical Devices. FDA. https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-enabled-medical-devices. Reviewed July 10, 2025. Accessed September 11, 2025.

Lessons From Event Reports:

Reporting a Problem Catches Unrecognized Systemwide Failure

At around 10 p.m. on a Sunday, a registered nurse reported to her unit director that her telemetry pager was not receiving alarms for a patient with arrhythmias, although it had been working when she came on the night shift at 7 p.m. When her pager also didn’t receive the test page she sent, she test paged all the other RNs’ pagers on her unit and discovered that none of them were working. She stationed an RN at the telemetry monitor to watch all the patients being monitored and checked with the nursing supervisor. Although none of the other units had reported an issue, this was only because they hadn’t noticed their pagers weren’t receiving alarms until the RN brought the problem to their attention. She notified clinical engineering of the hospitalwide outage; they rebooted the system, and pager alarms were functioning again by 12:30 a.m. The RN’s attentiveness and diligence ensured that no critical alarms went unnoticed and unaddressed during the system downtime and quickly resolved a serious problem that otherwise no one would have been aware of.