There has been considerable discussion in the press over the past week regarding a proposed Centers for Medicare and Medicaid Services rule that would mandate information exchange for all organizations participating in the program.
The discussion has certainly stirred up the industry, and it is only just beginning.
Responses have been received from entities such as the American Hospital Association, the Electronic Health Records Association, and the American Medical Informatics Association, and, most notably from my perspective, from CHIME, which stated:
“A distinction must be drawn between speeding and increasing data exchange among providers and achieving a true state of interoperability. The two should not be conflated.”
Interoperability exists on a continuum, ranging from the relatively simple problem of data transmission to full semantic exchange, where both parties understand the information being exchanged in its full context. CHIME's point is that we should not treat the leftmost point of that continuum as an acceptable answer.
True semantic interoperability is not widely available at this time, although efforts are being made to improve the situation through standards such as FHIR. As Stan Huff of Intermountain Healthcare has pointed out, even FHIR, in and of itself, is not sufficient to guarantee semantic interoperability: knowing that a specific data element is exchanged as a LOINC code still leaves considerable variability in which codes can be used and what they are intended to mean.
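To make that variability concrete, here is a minimal sketch (not from the article) of two FHIR Observation fragments that both report the same glucose result but use two different, equally valid LOINC codes; the resource contents and the naive matching logic are illustrative assumptions, though the two LOINC codes themselves are real entries:

```python
# Sketch: two FHIR Observation fragments reporting a glucose of 95 mg/dL,
# each using a different valid LOINC code. A receiver that matches on
# exact codes treats them as unrelated results.

obs_a = {
    "resourceType": "Observation",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "2345-7",   # Glucose [Mass/volume] in Serum or Plasma
                         "display": "Glucose [Mass/volume] in Serum or Plasma"}]},
    "valueQuantity": {"value": 95, "unit": "mg/dL"},
}

obs_b = {
    "resourceType": "Observation",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "2339-0",   # Glucose [Mass/volume] in Blood
                         "display": "Glucose [Mass/volume] in Blood"}]},
    "valueQuantity": {"value": 95, "unit": "mg/dL"},
}

def codes(obs):
    """Extract the (system, code) pairs a naive receiver would match on."""
    return {(c["system"], c["code"]) for c in obs["code"]["coding"]}

# A clinician would read both as "a glucose of 95 mg/dL", but a
# code-equality check sees two different data elements.
print(codes(obs_a) == codes(obs_b))  # False
```

Both resources are structurally valid FHIR, which is exactly the point: the standard constrains the envelope, not the choice or meaning of the codes inside it.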
So, given that a “true state” of interoperability is probably not achievable in the short to medium term, what should CMS be doing to ensure that interoperability meets our expectations and that its latest efforts are not just another Meaningful Use program?
First, I believe there should be more focus on the mandates of the 21st Century Cures Act to prevent information blocking. This is only partially a technical problem, as the business problems that must be solved are considerable.
At my company, eHealth Technologies, we specialize in retrieving records from providers, and as such we come across information blocking on a daily basis. We’ve even encountered situations where two providers sitting across the road from each other will not exchange information via fax lines, and we’ve been engaged to ensure critical patient data is liberated from this impasse.
These situations show that exchange is not all about technology. The reality is that if you can’t achieve fax transmission, your chances of achieving semantic interoperability are minimal at best.
Second, I believe we should focus more on existing technology, such as the Direct project, to exchange data before we look to FHIR-enabling the entire IT infrastructure of this country. Many providers already have Direct addresses and can move Consolidated CDA documents around successfully. The technology exists, it’s real, and we paid a lot of money through the Meaningful Use program to get it.
Finally, we need to let TEFCA develop. The Trusted Exchange Framework and Common Agreement lays out a new approach to data exchange by setting up Qualified Health Information Networks (QHINs) that offer a simple on-ramp for other entities to join national exchange. It’s a voluntary system that does not require Health Information Networks (HINs) to become QHINs. However, ONC is recommending that this become mandatory for providers, payers, and states, and it will continue to push the framework hard.
Will using existing technology, taking down information blocking, and focusing on a national exchange program be enough to ensure we move in a measured way towards semantic interoperability? I think so, and would welcome your comments!