Hand hygiene is widely recognized as the single most important measure for reducing healthcare-associated infections (HAIs).
Measuring clinicians’ hand hygiene performance is critical to addressing this problem. Until recently, the only way to do so was through Direct Observation (DO): having “mystery shoppers” in hospital hallways note whether or not clinicians performed hand hygiene.
DO was considered the “gold standard” of hand hygiene monitoring because it was essentially the only way to measure performance. Yet despite the widely accepted importance of hand hygiene, performance rates remain a problem for nearly all hospitals. Let’s be realistic: if Direct Observation were the answer, the nationwide average wouldn’t still be below 50%.
Today, technology has replaced DO as the gold standard. The new electronic hand hygiene monitoring systems overcome many of the significant limitations that have consistently plagued DO. Here are the top five reasons that Direct Observation is no longer the gold standard of hand hygiene monitoring.
1. The Hawthorne Effect. The Hawthorne Effect is a well-documented phenomenon whereby clinicians who know they’re being watched are up to three times as likely to sanitize their hands. The “secret shoppers” doing DO are never really secret, and clinicians easily figure out when they’re being observed. As a result, they do what they’re supposed to while being watched, then quickly return to their usual patterns as soon as the observer goes away. This means that hospitals reporting data from DO are overstating their true hand hygiene rates. Our data has shown that this effect artificially inflates reported rates by as much as threefold, and this has been confirmed by others.1,2 Hospitals are simply relying on inaccurate data, and it’s leading to a dangerous, false sense of security.
2. Small sample size. A typical 30-bed hospital unit will experience around 75,000 hand hygiene opportunities a month. Obviously, it’s impossible for a human being to observe all of these. In fact, most hospitals with robust DO programs will report capturing only around 60 hand hygiene opportunities per month, which represents less than 0.08% of the total. This sample size is so small that it can’t be extrapolated to represent the bigger picture.
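To make the sample-size problem concrete, here is a quick sketch using the figures cited above (75,000 opportunities per month, roughly 60 observed). The 50% compliance rate in the margin-of-error calculation is a hypothetical value chosen only to illustrate how wide the uncertainty is at this sample size:

```python
import math

# Illustrative figures from the article:
total_opportunities = 75_000   # hand hygiene opportunities per month on a 30-bed unit
observed = 60                  # opportunities captured by a robust DO program

fraction = observed / total_opportunities
print(f"Observed fraction: {fraction:.4%}")   # 0.0800%

# 95% margin of error (normal approximation) for a hypothetical observed
# compliance rate of 50% with n = 60 observations.
p = 0.5
margin = 1.96 * math.sqrt(p * (1 - p) / observed)
print(f"95% margin of error: +/- {margin:.1%}")
```

With only 60 observations, the margin of error is roughly plus or minus 13 percentage points, so a reported rate of 50% could plausibly reflect anything from the high 30s to the low 60s.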
3. Observation bias. Human beings are subject to unintentional biases in what they notice and recall. These biases stem both from conscious expectations and from subconscious beliefs. As it relates to hand hygiene, observers simply remember what they expect to see. If a hospital’s program is focused on catching people performing hand hygiene, observers will unintentionally and disproportionately record the times they see people washing their hands and fail to recall the opportunities when hand hygiene didn’t occur. We commonly see that an observer who personally believes physicians are less likely to sanitize than nurses ends up unintentionally recording exactly that pattern. What people see and what they record can be two very different things.
4. Inability to see into rooms. It’s difficult for observers to see inside a room under the best of circumstances. Unless someone is directly in the line of sight looking all the way into the room, she can only see the clinicians for a short period, not for the entire patient encounter. As a result, the observer will either disproportionately record times when she’s able to see people performing hand hygiene right by the door, or tend to miss times when people don’t because she simply can’t see. Either way, this artificially skews the numbers. Additionally, some of the most important opportunities for hand hygiene occur behind closed doors, and there is no way DO can address this. What’s the first thing clinicians do when performing a sterile procedure or changing a dressing? Sadly, it’s usually not to perform hand hygiene; it’s to close the door.
5. Inherent catch-22. With DO, it’s impossible to both record accurate data and improve results. Hospitals don’t use CT scans to treat fractures, nor a thermometer to treat sepsis, yet in most hospitals DO is expected to serve as both the monitor and the solution. If the observer is to remain anonymous, she can’t correct a clinician who doesn’t perform hand hygiene; what’s the point of monitoring a problem if you never change the behavior behind it? If, on the other hand, the observer does correct the clinician, her anonymity is compromised and the data she’s collecting is artificially inflated by the Hawthorne Effect.
It’s time to recognize that new technological breakthroughs have raised the bar beyond what DO can achieve. The old standard of DO isn’t working anymore; all it’s doing is giving hospitals a false sense of security. Electronic hand hygiene monitoring systems don’t fall victim to any of the five issues listed above. And the best systems go beyond simply monitoring to provide real-time reminders that change behavior. They give hospitals a way to finally address the problem of hand hygiene.
1. Srigley, J.A., et al., Quantification of the Hawthorne effect in hand hygiene compliance monitoring using an electronic monitoring system: a retrospective cohort study. BMJ Qual Saf, 2014. 23(12): p. 974-80.
2. Hagel, S., et al., Quantifying the Hawthorne Effect in Hand Hygiene Compliance Through Comparing Direct Observation With Automated Hand Hygiene Monitoring. Infect Control Hosp Epidemiol, 2015. 36(8): p. 957-62.
Chris Hermann, PhD, is the Founder and CEO of Clean Hands – Safe Hands. Dr. Hermann started and led the multi-institution research collaboration that developed the core technology utilized in the CHSH system. Over the last 11 years, he has led investigators from Children’s Healthcare, Georgia Tech, Emory School of Medicine, the GA Tech Research Institute and the Centers for Disease Control and Prevention. Dr. Hermann earned a PhD in Bioengineering, an MS in Mechanical Engineering, and a BS in Biomedical Engineering with High Honors from the Georgia Institute of Technology, and is an MD candidate at Emory School of Medicine.