Few patient harm events linked to electronic health record usability
Less than 1% of patient harm events reported from 2013 to 2016 were attributable to electronic health record usability, according to data recently published in JAMA.
“Electronic health record usability, which is the extent that EHRs support clinicians in achieving their goals in a satisfying, effective and efficient manner, is a point of frustration for clinicians and can have patient safety consequences,” Jessica L. Howe, MA, of the National Center for Human Factors in Healthcare in Washington, D.C., and colleagues wrote. “Specific usability issues and [electronic health record] clinical processes that contribute to possible patient harm across different health care facilities have not been identified.”
Researchers analyzed 1.735 million free-text patient safety reports from 571 health care facilities in Pennsylvania and an academic health care system elsewhere in the mid-Atlantic. Only reports that involved one of the top five electronic health record products or vendors, and that were classified as “reaching the patient with possible harm,” were included.
Howe and colleagues found that of the reported safety events, 1,956 specifically mentioned one of the included EHR vendors or products and were reported as possible patient harm. Of these, 557 used words that strongly suggested EHR usability contributed to possible patient harm. Among these latter incidents, 468 reached the patient and could possibly have needed monitoring to preclude harm; 80 could possibly have caused temporary harm; seven could possibly have caused permanent harm; and two might have needed intervention to sustain life or could have led to a fatality.
The most common usability challenges involved data entry (n = 152); alerting (n = 122); interoperability (n = 102); visual display (n = 52); availability of information (n = 50); system automation and defaults (n = 43); and workflow support (n = 36). Usability challenges occurred most often during the EHR clinical processes of order placement (n = 213), medication administration (n = 207), results review (n = 87) and documentation (n = 50).

“The analysis was conservative because safety reports only capture a small fraction of the actual number of safety incidents,” Howe and colleagues wrote. “Patient safety reports contain limited information making it difficult to identify causal factors and may be subject to reporter bias, inaccuracies, and a tendency to attribute blame for an event to the EHR.”
Further research is needed to ascertain the links between EHR usability and patient harm and how frequently such events occur, the researchers added. – by Janel Miller
Disclosure: The authors report no relevant financial disclosures.