Study questions effectiveness of EVS self-monitoring of hospital room cleanliness
Recently published findings validate the CDC’s recommendation that independent observers provide the most objective approach to monitoring how thoroughly environmental services staff clean and disinfect hospital rooms after they are used.
The CDC recommends that hospital epidemiologists or infection preventionists who are not part of environmental services (EVS) monitor the environmental cleaning and disinfection of hospital rooms, which can reduce transmission of health care-acquired pathogens.
As part of the Benefits of Enhanced Terminal Room (BETR) disinfection study, a large, multicenter randomized controlled trial comparing terminal disinfection strategies, Deverick J. Anderson, MD, MPH, associate professor of medicine at Duke University, and colleagues compared two methods of evaluating the cleanliness of hospital rooms after EVS staff have cleaned and disinfected them. They found a difference of roughly 30 percentage points between the proportion of surfaces rated clean by EVS supervisors and the proportion rated clean by research assistants.
In the study, research assistants collected data in 56 rooms and EVS supervisors evaluated 256 rooms at two hospitals. The rooms were not assessed concurrently; instead, Anderson and colleagues matched the 56 rooms evaluated by research assistants with results from 56 rooms in the EVS group. Anderson and colleagues said this allowed them to make “general conclusions” about the results, which were strengthened by matching rooms by unit, date and time.
They compared the overall proportion of cleaned surfaces and the cleanliness of six specific surfaces: bathroom handrail, door knobs, light switches, toilet seat, sink and chair. Overall, EVS supervisors determined that 82.5% (264 of 320) of surfaces had been cleaned, compared with 52.4% (153 of 292) of surfaces as assessed by the research assistants.
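The roughly 30 percentage point gap described above follows directly from these reported counts. The short sketch below is only an illustrative arithmetic check of the published figures, not part of the study’s analysis.

```python
# Illustrative check of the reported cleaning-assessment gap,
# using the surface counts stated in the study.

evs_clean, evs_total = 264, 320   # surfaces rated clean by EVS supervisors
ra_clean, ra_total = 153, 292     # surfaces rated clean by research assistants

evs_rate = evs_clean / evs_total  # 0.825 -> 82.5%
ra_rate = ra_clean / ra_total     # 0.524 -> 52.4%

gap = (evs_rate - ra_rate) * 100  # roughly 30 percentage points

print(f"EVS supervisors: {evs_rate:.1%}")
print(f"Research assistants: {ra_rate:.1%}")
print(f"Gap: {gap:.1f} percentage points")
```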
Anderson told Infectious Disease News that the research assistants “didn’t really do anything different” than EVS supervisors. Both groups used fluorescent marks to determine which areas had been properly cleaned by EVS personnel.
“In the end, I suspect the difference is from a few things. First, it is well known that ‘self-policing’ leads to better numbers than from independent observers,” Anderson said, noting studies that have shown this effect in hand hygiene monitoring.
He suggested “human nature” also may have played a role in EVS supervisors crediting EVS staff with effectively cleaning and disinfecting some surfaces.
“I wouldn’t go so far as to say it is because of outright cheating — though it perhaps could be — but I think you’re more likely to give someone credit if you’re on the same team,” he said. “I can’t prove this, but I also suspect the EVS folks chose similar spots. Both groups were instructed to rotate locations — for example, don’t put a marker on the bedside table every time — but I am willing to bet the independent observers did a better job of following those instructions.”
Anderson and colleagues said that if objective monitoring is not feasible, hospitals should consider selective sampling of rooms by external observers as a method to validate EVS monitoring.
“Similar to hand hygiene, external validation of room cleaning improves the validity of cleaning surveillance data. Feedback of validated data to EVS personnel may improve terminal cleaning and decrease the risk of bacterial transmission between patients,” they wrote. – by Gerard Gallagher
Reference:
Knelson LP, et al. Infect Control Hosp Epidemiol. 2017;doi:10.1017/ice.2017.205.
Disclosures: Anderson reports no relevant financial disclosures. One author reports consulting for PDI.