Paper details benefits, possible risks of AI in pediatric care
Key takeaways:
- Researchers noted AI’s potential in diagnosis and management, among other uses.
- The authors warned that algorithms have the potential to perpetuate bias and must be trained with large volumes of diverse data.
Artificial intelligence, or AI, has shown potential in diagnostics and management in pediatric care, according to a review of the technology published in the Journal of Medical Internet Research.
“I have been a big proponent of technology, innovation and AI to step into gaps, but it is important that they are used with data, evidence and safeguards,” Hansa Bhargava, MD, co-author of the study, told Healio.
Bhargava, who is also chief clinical strategy and innovation officer at Healio, said the paper was meant to help clinicians understand specific uses of AI in the field of pediatrics.
In their review, Bhargava and colleagues examined multiple use cases in which clinicians are applying AI to diagnostics. For example, they mentioned that “a leading children’s hospital has developed a program that uses pattern recognition and real-time data analysis to efficiently diagnose rare diseases in newborns.” Also, an FDA-authorized AI-based diagnostic device uses inputs from health care providers and caregivers, along with video analysis, to help diagnose or rule out autism in young children.
In radiology, which the researchers called “perhaps the most impressive application of AI in medicine,” convolutional neural networks can accelerate the timeline of a life-threatening diagnosis that requires rapid intervention, such as acute ischemic stroke, or a non-life-threatening one, such as a small pneumonia.
The authors also noted that early detection of pediatric sepsis is a “serious concern,” and a study of about 500 pediatric ICU patients showed that AI detected severe sepsis as early as 8 hours before traditional screening methods that relied on electronic medical records.
Beyond diagnosis, the authors examined AI’s potential role in managing adolescent depression and anxiety. They mentioned the mental health app Woebot, which provides digital cognitive behavioral therapy to users. Additionally, AI is being deployed to reduce opioid dependence, with one AI-powered software — called OR Advisor — helping a children’s hospital reduce opioid administration from 85% of surgeries to less than 1%.
The authors acknowledged the “regulatory, implementation and ethical challenges” in bringing AI to pediatric care. Specifically, they mentioned the need to train “unbiased AI algorithms” on large data sets that are representative of the pediatric population.
“Poorly conceived AI has the potential to cause harm and therefore requires serious clinical consideration and risk identification and mitigation as part of any workflow implementation process,” the authors wrote. “Without careful and equitable sampling, algorithms can potentially perpetuate rather than reduce bias.”
Bhargava said regular and careful assessment of AI in pediatrics is needed “as we move into the future.”
“I think that AI is a tool that can be used to better systems, but also just like any other tool, it has its potential pitfalls,” Bhargava said. “I really encourage my colleagues to learn about it, so that we can all work towards a better health care system.”