Fact checked by Richard Smith

July 31, 2024
Generative AI could provide echocardiogram reports suitable for patient education


Key takeaways:

  • Generative AI may be able to provide acceptably accurate explanations of patients’ echocardiography results.
  • Echocardiographers agreed that most explanations were suitable to send to patients without editing.

Explanations of echocardiography findings written using ChatGPT may be suitable to educate patients about their health status and reduce physician workload, according to data published in JACC: Cardiovascular Imaging.

Since the 21st Century Cures Act of 2016 federally mandated the immediate release of test results to patients, clinicians have faced rising numbers of inquiries from uneasy patients unable to interpret their echocardiogram results. Dissemination of accurate information is further bottlenecked by electronic health record protocols, Lior Jankelson, MD, PhD, associate professor of medicine at the NYU Grossman School of Medicine and AI leader for cardiology at NYU Langone, and colleagues wrote.

Generative AI may be able to provide acceptably accurate explanations of patients’ echocardiography results. Image: Adobe Stock

The researchers therefore evaluated the utility of generative AI to explain echocardiogram reports to patients.

“Our study, the first to evaluate GPT4 in this way, shows that generative AI models can be effective in helping clinicians to explain echocardiogram results to patients,” Jankelson said in a press release. “Fast, accurate explanations may lessen patient worry and reduce the sometimes-overwhelming volume of patient messages to clinicians.”

The researchers used ChatGPT with Generative Pre-trained Transformer 4 (GPT-4) to generate explanations of 100 echocardiogram reports from NYU Langone Health. The explanations were then graded by five echocardiographers on 5-point Likert scales for acceptance, accuracy, relevance, understandability and representation of quantitative information.

The median length of the AI-generated explanations was 1,186 characters, longer than the conclusions of the echocardiogram reports but roughly half the length of the full reports, according to the researchers.

Reviewing echocardiographers agreed or strongly agreed that 73% of AI-generated explanations were suitable to send to patients without edits. Overall, 84% of explanations were rated “all true,” with the remainder rated “mostly correct.”

In addition, echocardiographers rated 76% of AI explanations as containing “all of the important information,” 15% as containing “most,” 7% “about half” and 2% “less than half” of the relevant information.

Moreover, the researchers noted that none of the explanations containing errors or missing information was designated “potentially dangerous” by the reviewing echocardiographers.

“If dependable enough, AI tools could help clinicians explain results at the moment they are released,” Jacob A. Martin, MD, MSCR, cardiology fellow at NYU Langone, said in the release. “Our plan moving forward is to measure the impact of explanations drafted by AI and refined by clinicians on patient anxiety, satisfaction and clinician workload.”

The researchers noted that although echocardiographers rated most AI-generated explanations as understandable for patients, the average U.S. adult reads at an eighth-grade level, and the AMA and CDC recommend writing patient educational materials at a sixth-grade reading level. The AI-generated explanations in this study scored at a seventh-grade reading level.
