Baltimore-based Johns Hopkins Hospital spent more than $5.6 million on quality metric reporting in 2018, according to a study published June 6 in JAMA.
A team of researchers from Johns Hopkins University set out to calculate how much time and money were devoted to measuring and reporting quality metric data to government and national healthcare rating organizations. Their findings are based on interviews conducted in 2019 with hospital personnel involved in quality reporting activities. The study did not include resources spent on quality improvement efforts at the hospital.
Johns Hopkins Hospital reported 162 unique quality metrics in 2018, 97 of which were applicable to hospitals nationwide. Preparing and reporting the data took an estimated 108,478 hours, costing $5,038,218.28. An additional $602,730.66 was spent on vendor fees, according to the study. Claims-based metrics were the most resource-intensive metric type, while electronic metrics required the least.
"Clinicians are dedicating more time to reporting quality and less time for patient care, and hospitals are allocating resources to regulatory compliance instead of patients," Ge Bai, PhD, study author and healthcare accounting expert at Johns Hopkins Carey Business School, said in a statement to Becker's. "Many quality metrics are duplicative, expensive to generate, and offering no tangible benefit for care improvement."
The study suggests that policymakers, quality metric designers and hospital executives consider the costs associated with reporting and pursue ways to ease the burden, such as cutting the overall number of metrics and investing in electronic metrics.
Separately, a recent survey of 43 infection control experts at U.S. hospitals found that they question the effectiveness of certain healthcare-associated infection (HAI) measures when it comes to making improvements and protecting patients. The survey was led by researchers at the Baltimore-based University of Maryland School of Medicine and published in JAMA Network Open.
Respondents indicated that many metrics, such as surgical site infections and antibiotic-resistant bloodstream infections, were important to report. However, most said two metrics related to sepsis management and ventilator-associated infections were not useful. Moreover, 84 percent of respondents said they believe HAI rates reported to CMS are manipulated by hospitals or staff.
"Some [patients] have infections that can't be prevented, while other metrics we are required to report aren't indicative of an infection and don't lead to an improvement in the quality of care that patients receive," lead study author and assistant professor of medicine at UMSOM Gregory Schrank, MD, said in a news release. "Our survey found that tracking these metrics can detract from other important infection prevention work."