May 29, 2018

Automated surveillance for VAEs more accurate, curtails human error


Study findings showed that an automated, algorithm-based system for detecting ventilator-associated events, or VAEs, in hospitalized patients is more accurate and efficient than traditional manual surveillance by infection control staff, a process prone to human error, researchers said.

“Ventilator-associated pneumonia is a very serious problem that is estimated to develop in up to half the patients receiving mechanical ventilator support,” Brandon Westover, MD, PhD, a physician in the department of neurology at Massachusetts General Hospital (MGH) and director of the hospital’s Clinical Data Animation Center (CDAC), said in a news release.

Westover and colleagues developed an algorithm to retrospectively review ventilated ICU patients at Massachusetts General Hospital to identify VAEs and compared the results with surveillance conducted by infection control staff. According to results published in Infection Control & Hospital Epidemiology, the automated system was 100% accurate at identifying at-risk patients when supplied with necessary data.

“In our study, manual surveillance made many more errors than automated surveillance,” Erica S. Shenoy, MD, PhD, an infectious disease physician at MGH and hospital epidemiology lead for CDAC, said in the release.

According to Shenoy, errors included false-positive results, misclassification of VAEs as more or less serious than they actually were and failure to detect and report cases that met criteria upon closer inspection. “In contrast, so long as the necessary electronic data were available, the automated method performed perfectly,” she said.

The algorithm was tested in a development cohort from January to March 2015 and debugged before being used in a validation cohort from January to March 2016.

To determine whether criteria were met for a VAE — and if so, at which CDC-defined level — the algorithm analyzed physiologic data to detect increases in positive end-expiratory pressure and the fraction of inspired oxygen, queried the electronic health record for leukopenia or leukocytosis and antibiotic initiation data, and retrieved and interpreted microbiology reports, the researchers said.
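The paper does not publish the algorithm’s code, but the first-tier check it describes — a sustained rise in daily minimum PEEP or FiO2 after a period of stability, per the CDC’s ventilator-associated condition (VAC) definition — can be sketched in a few lines. The following is a minimal, hypothetical Python illustration; the function and variable names are ours, not from the study:

```python
from typing import List, Optional

# Thresholds from the CDC's ventilator-associated condition (VAC) tier of
# the VAE definition: a sustained rise in daily minimum PEEP or FiO2 over
# a 2-day baseline of stability. (Illustrative sketch, not the MGH code.)
PEEP_RISE = 3.0    # cmH2O
FIO2_RISE = 0.20   # absolute fraction of inspired oxygen (20 points)

def first_vac_day(daily_min_peep: List[float],
                  daily_min_fio2: List[float]) -> Optional[int]:
    """Return the index of the first day meeting the VAC criterion:
    >=2 days of stable/improving settings, then >=2 days with daily
    minimum PEEP up >=3 cmH2O or FiO2 up >=0.20 over the baseline."""
    n = len(daily_min_peep)
    for day in range(2, n - 1):
        # Baseline = the 2-day window immediately before the suspected rise.
        base_peep = min(daily_min_peep[day - 2:day])
        base_fio2 = min(daily_min_fio2[day - 2:day])
        peep_up = all(p >= base_peep + PEEP_RISE
                      for p in daily_min_peep[day:day + 2])
        fio2_up = all(f >= base_fio2 + FIO2_RISE
                      for f in daily_min_fio2[day:day + 2])
        if peep_up or fio2_up:
            return day
    return None

# Example: stable on PEEP 5 for two days, then a sustained jump to 8.
print(first_vac_day([5, 5, 8, 8], [0.40, 0.40, 0.45, 0.45]))  # -> 2
```

A full surveillance system would then join these physiologic flags with the leukocyte, antibiotic and microbiology queries described above to assign the CDC-defined event level.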

Manual surveillance, which included reviewing data recorded by a respiratory therapist, was conducted by certified staff from the hospital’s infection control unit with a combined 30 years of experience, they said.

Among 1,325 admissions in the development cohort, there were 479 ventilated patients, 2,539 ventilator days and 47 VAEs. In the validation cohort, there were 1,234 admissions, 431 ventilated patients, 2,604 ventilator days and 56 VAEs, the researchers reported.

According to the results, the sensitivity, specificity and positive predictive value (PPV) of automated surveillance in the development cohort were all 100%. In the validation cohort, sensitivity was 85%, specificity was 99% and PPV was 100%. By comparison, manual surveillance had a sensitivity of 40%, a specificity of 98% and a PPV of 70% in the development cohort, and a sensitivity of 71%, a specificity of 98% and a PPV of 87% in the validation cohort.
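These measures follow the standard definitions from a 2-by-2 confusion matrix. A quick sketch of how they are computed, using hypothetical counts rather than the study’s raw case data:

```python
def surveillance_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard surveillance performance measures from a 2x2 confusion
    matrix of flagged vs. true VAEs."""
    return {
        "sensitivity": tp / (tp + fn),  # share of true VAEs that were detected
        "specificity": tn / (tn + fp),  # share of non-VAEs correctly cleared
        "ppv":         tp / (tp + fp),  # share of flagged cases that were real
    }

# Hypothetical counts for illustration only (not the study's data):
# 40 of 56 true VAEs flagged, no false positives, 375 patients cleared.
print(surveillance_metrics(tp=40, fp=0, tn=375, fn=16))
```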


“In summary,” the authors concluded, “we have developed a fully automated VAE surveillance system with opportunities for increased accuracy of surveillance, the potential to improve patient care processes and outcomes, and the assessment of interventions aimed at enhancing care and reducing complications of mechanical ventilation.” – by Gerard Gallagher

Disclosures: The authors report no relevant financial disclosures.