Fact checked by Heather Biele


May 02, 2024
2 min read

Natural language processing could detect suicide risk missed by typical screenings


Key takeaways:

  • Fifty-eight percent of users flagged for possible suicidal ideation did not indicate it on health assessments.
  • Users received crisis resources within minutes, and a response team followed up within hours.

Natural language processing software identified possible suicide risk among patients who had not indicated suicidal ideation through typical health screenings, according to study results.

“At NeuroFlow, we have been sounding the alarm on the mental health crisis since before the pandemic,” Tom Zaubler, MD, MPH, chief medical officer at NeuroFlow, the proprietor of the digital behavioral health platform used in the study, and one of the study’s principal investigators, told Healio. “Our study design was about understanding the current shortcomings of screening for suicidal ideation, particularly in a primary care setting. Clinicians cannot keep up with the necessary amount of patient screenings without technology, and the hypothesis is that artificial intelligence and natural language processing can automate some of these processes that can be taxing on clinicians.”

Tom Zaubler, MD, MPH

The retrospective database study, published in Innovations in Digital Health, Diagnostics and Biomarkers, included 425 patients (women, n = 316; mean age, 41.67 years) from health care settings that had been using NeuroFlow’s digital behavioral health platform and who had been flagged by its natural language processing AI for expressing suicidal ideation between Jan. 8, 2020, and Aug. 9, 2023.

NeuroFlow’s platform detects language associated with suicidal ideation in free-form text entries (in this case, journaling exercises) and, within 2 minutes, emails the user crisis support resources. The flagged text is also routed to a response team, which evaluates the alert and contacts the patient using suicide prevention strategies. Users continue to receive digital support resources for 4 days after the alert.
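The flag-then-escalate workflow described above can be sketched in a few lines. This is a deliberately simplified illustration, not NeuroFlow's proprietary system: the phrase list, function names, and keyword-matching approach are all assumptions for demonstration, whereas a production NLP model would be far more sophisticated than substring matching.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical phrase list for illustration only; a real system would use a
# trained NLP model, not keyword matching.
RISK_PHRASES = ("want to die", "end my life", "kill myself", "no reason to live")

@dataclass
class Alert:
    user_id: str
    matched_phrase: str
    created_at: datetime

def screen_journal_entry(user_id: str, text: str) -> Optional[Alert]:
    """Flag a free-form journal entry that contains risk language."""
    lowered = text.lower()
    for phrase in RISK_PHRASES:
        if phrase in lowered:
            return Alert(user_id=user_id, matched_phrase=phrase,
                         created_at=datetime.now())
    return None  # no risk language detected

def handle_entry(user_id: str, text: str, send_resources, notify_team):
    """On a flag: send crisis resources to the user, then queue team follow-up."""
    alert = screen_journal_entry(user_id, text)
    if alert is not None:
        send_resources(user_id)  # automated email with crisis resources
        notify_team(alert)       # human response team evaluates and contacts
    return alert
```

The key design point the study highlights is the two-tier response: an immediate automated message to the user, followed by a human follow-up within hours.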

Researchers evaluated participants’ medical histories from 30 days prior to the alert to review their Patient Health Questionnaire-9 (PHQ-9) responses regarding suicidal ideation.

Of the 425 patients flagged for possible suicidal ideation, 344 had completed a PHQ-9 assessment within 30 days before the alert. Roughly half (n = 177) of those participants indicated suicidal ideation on the PHQ-9, with a mean score of 18.45, and 167 did not (mean score, 8.42).
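The 58% figure in the key takeaways can be reconciled with these counts under one reading, assumed here: flagged users who did not endorse ideation on a recent PHQ-9 include both the 167 who screened negative and the 81 who had no completed screen in the prior 30 days.

```python
flagged = 425          # users flagged by the NLP software
completed_phq9 = 344   # completed a PHQ-9 within 30 days before the alert
endorsed = 177         # indicated suicidal ideation on the PHQ-9

# Flagged users not identified by a recent PHQ-9: screened negative (167)
# plus those with no completed screen (425 - 344 = 81).
missed = flagged - endorsed               # 248
print(round(100 * missed / flagged))      # prints 58
```

This is an arithmetic check on the reported numbers, not a calculation stated explicitly in the study.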

The median time from alert to response team follow-up was 2 hours 12 minutes during normal business hours and 12 hours 27 minutes during off-hours.

“The biggest takeaway was that treatment as usual was undercounting the number of patients with suicidal ideation; 58% of NeuroFlow users whose suicidal ideation was detected by natural language processing may not have been identified otherwise,” Zaubler said. “When you think about the statistic that NLP identifies suicide risk among a large cohort of individuals who otherwise were not endorsing concerns about safety, and extrapolate to a national scale, it really feels like a wake-up call.”

Zaubler added that clinicians should harness the power of AI and consider transitioning to a technology-based approach to risk screening.

“Early detection of suicidal ideation or depression symptoms can help clinicians prevent escalation, meaning less clinical burden,” he said. “The emergence of artificial intelligence shouldn’t be met with friction; this is a tool that can and will save lives and empower clinicians to deliver more impactful care. We need to continue to study its role in larger environments, but the potential is truly limitless.”
