EHRs, digital imaging and the slow evolution of interoperability

September 02, 2019
by Sean Ruck, Contributing Editor
Hospitals and doctors are required to use electronic health records or get hit with financial penalties under Medicare reimbursement rules. While the rules have been around for years and most providers are following them, that doesn't mean the transition has been easy or that data interoperability is where it needs to be today.

EHRs first started to appear in academic medical centers in the late '60s and early '70s. By the 1980s, they were in wider use at hospitals across the country. And while digital imaging emerged at essentially the same time as EHRs, the two technologies evolved in parallel rather than in sync with each other.

Part of the problem is that interoperability, even among the same type of technology, hasn't met the expected or promised standards. Different vendors have had different levels of success, but that's easy to understand. After all, putting a lot of resources into ensuring your technology plays well with a competitor's, with no assurance that the competitor is doing the same, is a tough pill to swallow.

Dr. Chris Roth
Dr. Chris Roth, associate professor of Radiology, vice chair of Information Technology and Clinical Informatics, and director of Imaging Informatics Strategy at Duke Health, explains the even greater interoperability challenge found between imaging informatics and EHRs. "Compared to many other areas of medicine, interoperability in imaging is unique because you're dealing with a unique imaging format and communications standard in DICOM," he says. Fortunately, many in the imaging community are working to bridge the data divide between imaging and the EHR. When you consider the spectrum of clinical multimedia in radiology, cardiology, ophthalmology, obstetrics, the emergency department, operative suites, dermatology and other medical specialties, the opportunity is significant.
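For readers less familiar with DICOM, a minimal sketch of what that means in practice, assuming the open-source pydicom library and a hypothetical local file name: a DICOM object bundles pixel data with standardized metadata tags, and it is those tags that downstream systems such as the EHR have to read consistently.

```python
# Minimal sketch: read a hypothetical DICOM file and inspect a few of the
# standard metadata tags that travel with the pixels.
import pydicom

ds = pydicom.dcmread("chest_xray.dcm")  # hypothetical file path

print(ds.Modality)                               # e.g., "CR", "DX", "CT"
print(ds.StudyInstanceUID)                       # links the image to its study
print(ds.get("BodyPartExamined", "<not set>"))   # often varies by vendor and site
```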

Providers require data integration between technologies and between specialties to provide holistic patient care. Some of these technologies are fixed diagnostic scanners, while others are mobile. Most providers have the ability to take a patient photo or video, integrate it into the electronic medical record, and assign some straightforward metadata to it using the Integrating the Healthcare Enterprise Radiology Encounter-Based Imaging Workflow (IHE EBIW) profile. But the profile requires build work within and between institutions to create appropriate clinical workflows and interoperable data. "When capturing images, medical professionals often still disagree on the terms to use to describe the body part," says Roth. The HIMSS-SIIM Enterprise Imaging Community is tackling this challenge to determine the optimal ontology for consistently describing body parts. Roth continues, "Pretend there is an open fracture of the upper arm. The big bone in an X-ray of the upper arm is called the humerus. Taking a photograph of the broken bone sticking out of the same upper arm, the body part might be described as arm or skin. True interoperability means the X-ray and the photo need to relate to each other in computer terms, and we must agree on those terms so providers can consume them in the EHR together. Hospital business and informatics leadership should point their EHR imaging terms in this interoperable direction."
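To make the humerus example concrete, here is a small, hypothetical Python sketch. The local labels and the shared code "UPPER_ARM" are invented stand-ins rather than an actual ontology; the point is only that the radiograph and the photograph must resolve to the same agreed-upon term before an EHR can relate them.

```python
# Hypothetical mapping from loose, source-specific labels to one shared
# body-part code, so an X-ray and a wound photo of the same arm line up.
LOCAL_TO_SHARED = {
    "humerus": "UPPER_ARM",       # term used on the radiography system
    "arm": "UPPER_ARM",           # term a nurse might pick when photographing
    "skin of arm": "UPPER_ARM",
}

def shared_body_part(local_label: str) -> str:
    """Return the agreed-upon code, or flag the label for ontology review."""
    return LOCAL_TO_SHARED.get(local_label.strip().lower(), "UNMAPPED")

xray_label = "Humerus"
photo_label = "arm"
assert shared_body_part(xray_label) == shared_body_part(photo_label)  # now they relate
```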

So it's clear that the technology at many sites isn't exactly delivering what's needed for interoperability with imaging. But how well are imaging departments doing on their end? In a broad sense, not too well. Part of the blame lies with the vendors: different vendors have different ways of labeling very similar techniques and different ways of capturing and incorporating images into their systems, so the task gets harder with every additional system a radiologist has to work with. Part of the blame lies with physicians and hospitals that permit imaging metadata to vary.

Different scanner manufacturers call nearly identical sequences by different names. Different scanners within a hospital may or may not have similar imaging metadata, depending on who configured the scanner, the scanner's age, and how long ago those metadata were last reviewed. Until recently, there was no way to centrally modify and standardize the image data capture configurations on many devices at once. Integrating the Healthcare Enterprise Radiology recently created a profile called Management of Acquisition Protocols (IHE MAP) so that a lead technologist and physician can jointly and centrally review and modify the configurations of the entire scanner fleet, permitting consistent metadata capture even across scanner vendors. "Most hospitals around the country keep scanners for more than five years, often up to 10 years," Roth says. "Some scanners will always have only older sequences, and thus older metadata describing them, unless the institution puts in the effort to standardize how it captures scans. IHE MAP can assist with this data standardization and should be requested of scanner manufacturers during the RFP process as a play toward imaging data interoperability."
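The kind of variation IHE MAP targets can be illustrated with a short, hypothetical sketch. The scanner names and sequence labels below are invented, and the script is not an implementation of the IHE MAP profile; it simply shows the audit problem the profile addresses, by grouping near-identical sequence names so a lead technologist can see where naming has drifted across the fleet.

```python
# Hypothetical protocol names exported from three scanners of different
# vendors/ages; the goal is to spot the same acquisition hiding under
# different labels before standardizing it centrally.
from collections import defaultdict
import re

fleet_protocols = {
    "scanner_a": ["T1 MPRAGE SAG", "T2 FLAIR AX"],
    "scanner_b": ["t1_mprage_sag", "T2-FLAIR-ax"],
    "scanner_c": ["SAG T1 MPRAGE", "AX T2 FLAIR"],
}

def normalize(name: str) -> str:
    """Crude normalization: lowercase, split on separators, sort the tokens."""
    tokens = re.split(r"[\s_\-]+", name.lower())
    return " ".join(sorted(tokens))

groups = defaultdict(list)
for scanner, protocols in fleet_protocols.items():
    for p in protocols:
        groups[normalize(p)].append((scanner, p))

for variants in groups.values():
    names = {p for _, p in variants}
    if len(names) > 1:
        print("Inconsistent labels for one acquisition:", sorted(names))
```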

The inconsistencies create at least one additional source of friction: AI, which relies on accurate and consistent data as its lifeblood, suffers. A radiologist who dictates findings in a less-structured way, whether because of limited technology or personal preference, isn't adding clean and clear data. A radiologist with current technology and a mind for innovation, on the other hand, may use more structured terms that computers can understand.
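A small, hypothetical contrast illustrates the gap: a finding recorded as a coded field can be queried directly, while the same finding buried in dictated free text has to be recovered with keyword matching or natural language processing. The field names and the code value below are illustrative only, not drawn from any real reporting ontology.

```python
# Hypothetical report fragments: one structured, one free-text.
structured_report = {
    "finding_code": "FRACTURE_HUMERUS_OPEN",  # illustrative code, not a real standard
    "laterality": "left",
    "acuity": "acute",
}
free_text_report = "There is what appears to be an open break of the big upper arm bone."

# The structured version is trivially machine-readable...
has_open_fracture = structured_report["finding_code"] == "FRACTURE_HUMERUS_OPEN"

# ...while the dictated version forces brittle keyword matching or NLP.
has_open_fracture_guess = "open" in free_text_report and "break" in free_text_report

print(has_open_fracture, has_open_fracture_guess)
```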

The tools are slowly being developed, but the question is whether they'll reach their potential anytime soon. After all, there's been about half a century to work out the kinks.