There are signs that partisanship in Washington may be waning.
Interoperability, the ability to exchange medical information among disparate electronic health record (EHR) systems, is uniting at least two Senators from opposite sides of the party divide. Republican Lamar Alexander of Tennessee calls efforts to exchange health information "a glaring failure." He goes on to say, "It's a great idea; it holds great promise. But it's not working the way it's supposed to. The current standards for Meaningful Use aren't clear. Upgrades are expensive. The systems don't work to share the data; we hear it's expensive to share the data because of some of the relationships between vendors and doctors."
Democrat Elizabeth Warren of Massachusetts worries that patient records will still be mismatched even if EHRs become more interoperable. She cites a RAND Corporation study estimating that hospitals mismatch patient information about 8% of the time, even with data management software and personnel dedicated to solving the problem.
Frankly, neither critique is particularly helpful. Everybody knows we are in the midst of a health data explosion. Information that only three years ago could theoretically be stored in 10 billion four-drawer filing cabinets will require 500 billion such cabinets within five years. Would either Senator, or anybody else, suggest that all of this data doesn't need to be shared?
Yes, it has been frustrating. Yes, a lot that has been tried hasn't worked or hasn't been implemented quickly enough. And, yes, there have been errors.
Let's face it: Interoperability is hard. We don't always know where data is located. Some data is structured; some is unstructured. Not all data has been digitized. Definitions of data vary depending on the source. The information is complex and must somehow be moved from clinicians' heads and notes into data warehouses. And on top of it all, there are regulations and requirements from the government that sometimes clarify and sometimes confuse.
Interoperability becomes even harder to achieve because the incentives to achieve interoperability have been missing. Back in the day, vendors were incentivized to lead the paper-to-electronic gold rush. They invested time and money on software solutions to capture data. In a fee-for-service world, the paper-to-electronic push was well worth the effort.
But while there was plenty of incentive to capture data, there was little incentive for providers, organizations and vendors to share it. Why share with a competitor, especially when retaining the data could provide a competitive advantage?
Vendors want to hold healthcare data closely so that their clients stay locked into their platforms. To that end, each vendor uses proprietary models and terminology to represent clinical data.
Ownership of the data is so tightly held that dictionaries are not consistent vendor to vendor, or even organization to organization within the same vendor. Standards of nomenclature and vocabulary are vague, so sharing data means that each useful application interface must be created or re-created on each different platform, rendering application integration costs higher than they need to be.
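The translation burden this creates can be sketched in a few lines. The example below is a hypothetical mapping from two vendors' local lab codes to a shared vocabulary (the LOINC codes shown are illustrative); the vendor code names are invented, but the point stands: every pair of systems needs a table like this, built and maintained by hand.

```python
# Hypothetical sketch: translating vendor-proprietary lab codes into a
# shared vocabulary such as LOINC. Local code names are invented.

VENDOR_A_TO_LOINC = {
    "GLU-SER": "2345-7",   # Glucose [Mass/volume] in Serum or Plasma
    "HGB-BLD": "718-7",    # Hemoglobin [Mass/volume] in Blood
}

VENDOR_B_TO_LOINC = {
    "LAB_GLUCOSE": "2345-7",
    "LAB_HEMOGLOBIN": "718-7",
}

def to_shared_code(vendor: str, local_code: str) -> str:
    """Map a vendor-local code to the shared vocabulary, or fail loudly."""
    table = {"A": VENDOR_A_TO_LOINC, "B": VENDOR_B_TO_LOINC}[vendor]
    try:
        return table[local_code]
    except KeyError:
        raise ValueError(f"No mapping for {vendor}:{local_code}")

# Two different local codes resolve to the same shared concept:
assert to_shared_code("A", "GLU-SER") == to_shared_code("B", "LAB_GLUCOSE")
```

Multiply this table by every data type, every vendor pair, and every local customization, and the integration cost the paragraph above describes becomes concrete.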
Consider the impasse that has limited interoperability between inpatient and outpatient systems. In the data conversion and integration projects on which we at Galen Healthcare Solutions have worked, coalescence of data is always an issue. For instance, if a provider wishes to push data back from outside sources into the EMR, the following questions crop up all the time:
- Will the whole data set be pushed, or just a subset?
- Will the data cause duplication for records that already exist? Are there matching criteria for the data?
- How is the identity of the patient resolved?
- How are the underlying vocabularies and nomenclatures translated?
- How is the data reconciled to the patient's chart?
- Has the patient opted in for that data to be shared?
- Can I trust the validity of the data?
- What concerns around liability do I have if I ingest and store the data?
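To make the patient-matching question above concrete, here is a minimal sketch of deterministic matching: two records are treated as the same patient only if their normalized demographics agree exactly. The field names and rules are assumptions for illustration, not a production algorithm; real systems layer probabilistic scoring on top of rules like these.

```python
# Illustrative deterministic patient matching. Fields and rules are
# assumptions; production matching is typically probabilistic as well.

from dataclasses import dataclass

@dataclass
class PatientRecord:
    last_name: str
    first_name: str
    dob: str        # ISO date, e.g. "1970-01-31"
    sex: str        # "M" / "F" / "U"

def normalize(s: str) -> str:
    # Strip punctuation and case so "O'Brien" and "OBRIEN" compare equal.
    return "".join(ch for ch in s.lower() if ch.isalnum())

def is_match(a: PatientRecord, b: PatientRecord) -> bool:
    return (normalize(a.last_name) == normalize(b.last_name)
            and normalize(a.first_name) == normalize(b.first_name)
            and a.dob == b.dob
            and a.sex == b.sex)

inpatient = PatientRecord("O'Brien", "Mary", "1970-01-31", "F")
outpatient = PatientRecord("OBRIEN", "mary", "1970-01-31", "F")
assert is_match(inpatient, outpatient)    # same patient, different formatting

stranger = PatientRecord("O'Brien", "Mary", "1972-01-31", "F")
assert not is_match(inpatient, stranger)  # DOB differs: do not merge
```

Even this toy version shows why the 8% mismatch rate Senator Warren cites is plausible: every normalization rule that is missing or different between two systems is a chance to merge strangers or split one patient in two.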
Because these matters remain unresolved, inefficiency in the effort to achieve interoperability is inevitable. Without proper patient matching, without the ability to harmonize inpatient and outpatient data, and without real trust between those at the source and those at the destination, interoperability cannot succeed. As a result, we find the wheel being reinvented on a daily basis, with the same interface deployed over and over again. We also find that when provider organizations switch EHRs or interface engines, they often duplicate spending they have already incurred. By migrating to a new system, they risk leaving behind the data they worked so hard to capture, in addition to losing the IP (and money) they invested in the legacy system.
At the same time our frustrations with the pace of interoperability are exacerbated by expectations that are too great. Our sense of urgency has obscured clarity. Significant progress towards interoperability will occur but first we must have a more realistic, more clearly defined understanding of what interoperability means.
At Galen we define true interoperability as more than just data exchange. We define it as the availability of complete contextual information at the point of care, where it can be shared across clinicians, lab, hospital, pharmacy, and patient to permit clinical decisions -- regardless of the application or application vendor. In short, interoperability is care coordination, the delivery of insights and information quickly, efficiently and securely, to facilitate the real-time query of document-based and discrete data from the point of origin, where it is stored, to the point of use, such as another EHR, patient mobile device or population health registry.
Interoperability is liable to remain the "glaring failure" Senator Alexander has called it so long as we cling to outmoded ways of thinking and doing. We must acknowledge and embrace the paradigm shift that is transforming the way care is delivered. Fee for service has had its day. We're moving on. The public wants it, policy demands it, and logic commands it: payment models based on value delivered, not services rendered. And the essential glue that will make value-based care possible is coordinated care.
How will this accelerate interoperability? We believe it requires a broadcast and subscription model, one that operates in much the same way as the Internet: a query for information about a patient is transmitted and quickly acknowledged with a promise to provide what is needed and no more. Only the subset of information that is requested is packaged together in real time and coalesced within the existing chart to be presented at the point of care.
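The query-and-subscribe pattern described above can be sketched with a toy broker: a requester subscribes to a specific subset of a patient's data, receives an acknowledgment, and later gets only that subset when the source publishes. All class and field names here are hypothetical; a real implementation would sit on standards such as FHIR rather than an in-memory broker.

```python
# Toy sketch of query-and-subscribe for patient data. Names are
# hypothetical; a real system would use standards such as FHIR.

from collections import defaultdict
from typing import Callable

class RecordBroker:
    def __init__(self):
        # (patient_id, data_type) -> callbacks awaiting that data
        self._subscriptions = defaultdict(list)

    def subscribe(self, patient_id: str, data_type: str,
                  deliver: Callable[[dict], None]) -> str:
        self._subscriptions[(patient_id, data_type)].append(deliver)
        return "acknowledged"   # promise: what is needed, and no more

    def publish(self, patient_id: str, data_type: str, payload: dict):
        for deliver in self._subscriptions[(patient_id, data_type)]:
            deliver(payload)

broker = RecordBroker()
received = []
ack = broker.subscribe("pt-123", "lab_results", received.append)

# The source publishes labs and meds; only the subscribed subset arrives.
broker.publish("pt-123", "lab_results", {"loinc": "2345-7", "value": 98})
broker.publish("pt-123", "medications", {"rxnorm": "197361"})

assert ack == "acknowledged"
assert received == [{"loinc": "2345-7", "value": 98}]
```

The design choice worth noting is that the requester never receives the whole chart: the subscription itself defines the subset, which is what distinguishes this model from bulk record exchange.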
However, while the means to achieve true data liquidity in healthcare are well understood, the incentives remain murky. The only way to achieve interoperability in healthcare is through clarity in incentive models that break down traditional competitive data silos and information-blocking practices.
Arguably, we cannot rely on vendors or providers to incentivize themselves to share data. Rather, the fate of interoperability lies in the hands of the consumer: the patient.
As the costs of healthcare continue to rise, financially-burdened patients will demand the same information liquidity they have come to enjoy in other industries, the kind of consolidated data they need to make informed and economical decisions. EHR vendors be warned.
Justin Campbell is Vice President, Strategy at Galen Healthcare Solutions.
The views, opinions and positions expressed within these guest posts are those of the author alone and do not represent those of Becker's Hospital Review/Becker's Healthcare. The accuracy, completeness and validity of any statements made within this article are not guaranteed. We accept no liability for any errors, omissions or representations. The copyright of this content belongs to the author and any liability with regards to infringement of intellectual property rights remains with them.