In 2014, Vermont Information Technology Leaders (VITL), operator of the Vermont Health Information Exchange (VHIE), undertook a successful initiative to improve data exchange with industry stakeholders and to acquire more complete and accurate patient information for quality measures reporting.
VITL CTO Michael Gagnon shares insights from that journey.
Question: To kick off our conversation, tell us a little about VITL and the organizations it serves.
Michael Gagnon: VITL is a nonprofit organization tasked with advancing health care reform efforts in Vermont through the use of health IT. We are the legislatively designated operator of the VHIE, a secure, statewide data network that enables electronic exchange of patient information and access to needed data. We currently collect and manage a variety of patient data, including demographics, lab results, discharge summaries, radiology reports and medication histories. This information comes from hospitals, physician practices, Federally Qualified Health Centers, home health and long-term care organizations, designated agencies and commercial labs. Once a patient provides consent, information is made available in the VHIE network to authorized providers to help them make more informed clinical decisions at the point of care. As part of our mission, we are also assisting healthcare providers with the adoption and use of advanced infrastructures to support and improve quality care initiatives.
Q: Please describe how the need for the data quality project was identified.
MG: You hear a lot today about the role of big data and analytics in supporting value-based care. Healthcare organizations are trying to identify the best infrastructures for data exchange without much guidance. As a result, many investments miss the mark by not addressing the foundational components needed to capture complete and accurate data at the ground level. In reality, most organizations are missing key pieces of their patients' or members' health data, and the data they do have is not standardized or normalized to support accurate, meaningful analytics reporting.
Our goal with this project was to design and implement a new end-to-end data quality model to elevate quality reporting and advance population health initiatives. We were approached by several member organizations, including an accountable care organization (ACO) and the statewide patient-centered medical home (PCMH) program, to help them address data collection and quality problems. VITL was essentially tasked with centralizing data from a large and diverse group of organizations, and then cleaning it and managing its quality. As in many health networks, dozens of versions of EHRs and clinical information systems are in use across the hundreds of healthcare organizations in Vermont, each with varying data coding and exchange capabilities. Without the right infrastructure and strategy in place, our members faced significant barriers to achieving their overall population health and financial goals.
Q: How did VITL go about laying the needed infrastructure to support clean data capture for meaningful analytics?
MG: VITL already had an HIE technology infrastructure in place for successfully transmitting information to be used at the point of care; however, that solution was not sufficient for higher-level analytics needs. To get the desired result, we invested in a number of solutions, including an integration engine from Orion, a terminology management solution from Health Language, a data warehouse solution from Microsoft, a separate master patient index (MPI) and business intelligence tools from Tableau. Our team also built gateways and filtering mechanisms to identify populations within the PCMH and ACO, and established a mechanism for parsing disparate continuity of care documents down to the individual element level.
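For readers unfamiliar with what "element-level" parsing means in practice, the short Python sketch below shows the general idea for lab results in a CCD, using only the standard library. It is an illustration under simplifying assumptions, not VITL's implementation; real documents vary widely by EHR vendor and require per-source handling that this sketch omits.

```python
import xml.etree.ElementTree as ET

# HL7 CDA R2 namespace used by CCD / C-CDA documents
NS = {"cda": "urn:hl7-org:v3"}

def extract_lab_observations(ccd_path):
    """Pull each result observation out of a CCD down to the individual
    element level: LOINC code, display name, value and unit."""
    tree = ET.parse(ccd_path)
    results = []
    # Lab results in a CCD are modeled as <observation> entries.
    for obs in tree.iter("{urn:hl7-org:v3}observation"):
        code = obs.find("cda:code", NS)
        value = obs.find("cda:value", NS)
        if code is None or value is None:
            continue  # narrative-only or incomplete entries vary by vendor
        results.append({
            "loinc": code.get("code"),
            "name": code.get("displayName"),
            "value": value.get("value"),
            "unit": value.get("unit"),
        })
    return results
```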
Our strategy addressed three stages of data quality:
1. Data quality at the source, addressing people, workflows and reporting. VITL partnered with member organizations to teach them how to accurately capture data on the front-end.
2. Data quality in the middle, leveraging a specialized MPI and comprehensive terminology management tools to clean and normalize data for analytics and reporting (see the sketch after this list).
3. End-point analysis, closing the feedback loop by having the organizations performing the analytics assess their reporting needs and report back to VITL on what is still missing.
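To make stage 2 concrete, here is a minimal sketch of code-level normalization. The crosswalk entries and system names are hypothetical; in VITL's case this mapping lives in a commercial terminology management service rather than hard-coded tables.

```python
# Illustrative crosswalk from (source system, local code) to LOINC.
# In practice this mapping covers thousands of codes and is maintained
# by a terminology management service; these entries are hypothetical.
LOCAL_TO_LOINC = {
    ("hospital_a", "HGBA1C"): ("4548-4", "Hemoglobin A1c/Hemoglobin.total in Blood"),
    ("hospital_b", "A1C"):    ("4548-4", "Hemoglobin A1c/Hemoglobin.total in Blood"),
}

def normalize_result(source, local_code, value, unit):
    """Map a source-specific lab code onto a standard LOINC code so that
    results from different systems can be aggregated in one measure."""
    match = LOCAL_TO_LOINC.get((source, local_code))
    if match is None:
        # Unmapped codes are routed back to the source for review,
        # feeding the workflow fixes described in stages 1 and 3.
        return None
    loinc, display = match
    return {"loinc": loinc, "name": display, "value": value, "unit": unit}
```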
Q: What results have you achieved?
MG: The ACO is now collecting data on over 75,000 beneficiaries, and the completeness and accuracy of that data are greatly improved by the new data management and quality framework. By normalizing lab, drug and care summary data from hospitals, the ACO can report on HbA1c (ACO measures 22 and 27) and blood pressure (ACO measures 24 and 28).
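As an illustration of how normalized lab results roll up into such a measure, the sketch below computes an HbA1c poor-control rate. The 9% threshold follows the commonly used poor-control definition and is an assumption here, not a quote of the ACO's measure specification.

```python
from datetime import date

def hba1c_poor_control_rate(results, threshold=9.0):
    """Fraction of beneficiaries whose most recent HbA1c exceeds the
    threshold. `results` maps patient_id -> list of (date, percent)
    tuples of normalized HbA1c values (e.g., LOINC 4548-4)."""
    if not results:
        return 0.0
    poor = 0
    for patient_id, values in results.items():
        # Use the most recent result for each beneficiary.
        _, latest_pct = max(values, key=lambda v: v[0])
        if latest_pct > threshold:
            poor += 1
    return poor / len(results)

# Hypothetical usage:
sample = {"p1": [(date(2014, 1, 5), 8.2), (date(2014, 6, 1), 9.4)],
          "p2": [(date(2014, 3, 9), 7.1)]}
print(hba1c_poor_control_rate(sample))  # 0.5
```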
Q: What recommendations can you offer to other organizations in their quest for clean, accurate data exchange?
MG: First, health networks must develop an understanding of the scope of the effort they are undertaking. This process was complex, and its success was closely tied to an understanding of organizational nuances and the disparate systems that existed within and outside the network. For instance, we had to consider how to manage information across more than 70 EHR systems alone.
Also, HIEs specifically need to consider the political barriers that may exist to funding data quality initiatives. If VITL had been more proactive in educating state oversight groups about its role in master data management, the effort could have moved forward more quickly.
Healthcare organizations should not allow the complexities of these initiatives to overwhelm them into inaction. Practical steps can be taken with small amounts of data at a time to improve the picture. Advanced technologies and tools exist to aid organizations with these efforts, and the business case for investing in them is often an easy one to make.