The term “interoperability” has become so commonplace in healthcare and technology circles that everyone seems to recognize its importance, but defining what it is and where the problems lie can be confusing. If you’ve ever heard the term in seemingly unrelated scenarios and scratched your head, you’re not alone.
What is healthcare interoperability and why is it important?
All healthcare facilities have a system for storing information, especially as electronic records have become the baseline standard for housing patient data. Interoperability in healthcare refers to how these different information systems and applications integrate with one another and exchange data in order to provide useful information to providers, patients, researchers and other stakeholders. Poor interoperability often results in fragmented patient data, leading to medical errors, denied insurance claims, delays in treatment and more. Beyond patient care, it also stalls innovation in medical research and technologies, such as AI tools, when data is insufficient or unreliable.
HIMSS (Healthcare Information and Management Systems Society) divides interoperability into four levels. At the foundational level (Level 1), a system must establish the basic requirements to securely send information to and receive it from another. The structural level (Level 2) covers the format and syntax of data (i.e., data field structures), while the semantic level (Level 3) ensures that shared meaning, such as coding vocabularies and standardized definitions, is preserved. At the organizational level (Level 4), workflows and processes are securely integrated across numerous companies and healthcare systems.
What are the main regulatory bodies and policies in place that govern healthcare interoperability?
The Department of Health and Human Services’ ONC (Office of the National Coordinator for Health Information Technology) takes on much of the responsibility for regulating healthcare interoperability, particularly with regard to enforcing certain provisions of the 21st Century Cures Act, a landmark 2016 law that aimed to accelerate and manage medical innovation across sectors such as technology and drug development. The Centers for Medicare and Medicaid Services (CMS) has also been tasked with enforcing certain provisions, particularly around patients’ access to health records.
The Cures Act was preceded by the HITECH Act of 2009, which promoted the adoption of electronic health records (EHRs) by providers. HITECH also gave rise to the Meaningful Use program (now part of the Merit-based Incentive Payment System, or MIPS), which provided substantial financial incentives for providers and healthcare systems to replace paper-based records with EHRs that met specific certification requirements. By 2017, 86% of office-based physicians were using an EHR.
The ONC also oversees other IT-related laws and guidelines, such as Interoperability Standards Advisories, which provide a centralized model for identifying and evaluating critical interoperability concerns, gaps and updates within the healthcare industry. In addition, standards development organizations, or SDOs, are private organizations that must be accredited, typically by the American National Standards Institute (ANSI) or the International Organization for Standardization (ISO), and that provide much-needed guidelines for the various components of interoperability, from semantics to transport, functionality and more.
What are the obstacles preventing interoperability?
While many industries struggle to catch up to the rapidly changing pace of technology, healthcare is a notoriously late adopter. That means that when it comes to interoperability, there are some major barriers that stand in the way.
Standardization is perhaps one of the largest pain points for providers, healthcare organizations and IT vendors. It can refer to a myriad of components, such as consistency around semantics and coding, transfer integration, organizational workflow and more. There are a handful of SDOs whose guidelines have been widely adopted, but gaps and misinterpretations across sectors still persist.
Semantic standardization: Inconsistently formatted patient records are costly: patients can go unidentified or be misidentified, and duplicate records are often created. These duplicates cost hospitals $1,950 per patient, and 33% of denied insurance claims are a result of incorrect patient information. Even something as simple as the structure of a patient’s address can determine whether they are correctly identified when their health information is sent to another facility.
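To see how a trivial formatting difference can create a duplicate record, consider this minimal sketch. It is purely illustrative (the suffix table and normalization rules are hypothetical simplifications, not a real patient-matching algorithm), but it shows why two renderings of the same address fail a naive comparison unless both sides agree on a standard form first:

```python
import re

# Illustrative subset of street-suffix abbreviations (a real system would
# use a full standardized table, e.g. the USPS suffix list).
SUFFIXES = {"street": "st", "avenue": "ave", "boulevard": "blvd", "drive": "dr"}

def normalize_address(addr: str) -> str:
    """Lowercase, strip punctuation, and collapse suffix variants so that
    superficially different renderings of one address compare equal."""
    addr = re.sub(r"[.,#]", " ", addr.lower())
    tokens = [SUFFIXES.get(t, t) for t in addr.split()]
    return " ".join(tokens)

# Without normalization these strings differ, and a naive exact-match
# check would file the same patient under two records.
a = normalize_address("123 Main Street, Apt 4")
b = normalize_address("123 MAIN ST APT 4")
print(a == b)  # → True
```

The point is not the string-munging itself but that both the sending and receiving systems must apply the same rules; when each facility normalizes differently, the mismatches described above follow.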
Technical (or transport) standardization: Determining how information should be exchanged, and how the applications sharing data should be structured, is no small feat. There are some widely adopted data integration standards, such as those outlined by Fast Healthcare Interoperability Resources (FHIR), but guidelines still vary and can leave gaps in interpretation. While perfect interoperability is unattainable, a more consistent source of truth for healthcare systems and providers can spur better API development, thereby allowing systems to “communicate” more effectively, provide quick access to information, support innovative tools and more.
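To make the FHIR point concrete, here is a minimal sketch of what a standard like FHIR buys a consuming system. The JSON below is a stripped-down FHIR R4 Patient resource (real resources carry many more fields, and in practice this payload would come from a FHIR server rather than a string literal); because the resource type and field names are fixed by the standard, any conforming system can read the demographics the same way regardless of which EHR produced them:

```python
import json

# Stripped-down FHIR R4 Patient resource (illustrative). In practice this
# JSON would be returned by a FHIR server, e.g. via GET /Patient/{id}.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(patient_json)

# Because FHIR fixes the structure, extraction logic is the same for every
# conforming source system — no per-vendor field mapping needed here.
family = patient["name"][0]["family"]
given = " ".join(patient["name"][0]["given"])
summary = f"{given} {family}, born {patient['birthDate']}"
print(summary)  # → Peter James Chalmers, born 1974-12-25
```

This is exactly the kind of shared “source of truth” that lets independently built APIs and tools interoperate without custom translation code for each pairing of systems.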
Information blocking: In recent years, the ONC has taken on more efforts to combat information blocking by healthcare IT vendors, healthcare systems and providers, but much work remains. Whether it’s due to staffing and resource shortages or simply the need to add an additional revenue stream, organizations have been able to charge other facilities for patient data, erecting another barrier to the free flow of information.
There are many other hurdles impeding the flow of patient health information. For ethical and legal reasons, patient data must be securely stored and shared, and providers who don’t have a comfortable understanding of the technology they’re using may be apprehensive about trying new tools for fear of potential security breaches.
In addition, payment incentives put into place by government policies and programs often have good intentions but can unintentionally make providers’ lives harder if not paired with the right regulations. The Meaningful Use program, for example, incentivized doctors to switch from paper-based records to EHRs, but without explicit rules prohibiting information-blocking practices, providers often ended up with an EHR that couldn’t, or simply didn’t, allow for easy patient data migration when necessary. Many providers and patients still face the repercussions to this day.
What is currently being done to improve interoperability?
Final rules implementing the 21st Century Cures Act have recently been issued in an effort to address information blocking and patient access, encourage robust API usage for data flow, and more.
Specifically, the ONC has prohibited “technology vendors, health information exchanges, and health information networks from practices that inhibit the exchange, use, or access of EHI,” in addition to outlining eight exceptions to the rule. Criteria for API development were included in the final rule as well. Per the Cures Act, the ONC also established a Trusted Exchange Framework and Common Agreement, with the goal of facilitating information exchange between QHINs (qualified health information networks) on a national scale so that health records can easily follow patients wherever they are in the country. Private organizations have also taken matters into their own hands, establishing alliances such as CommonWell, a trade association that works with healthcare IT vendors and organizations to promote interoperability.
The pandemic has also demonstrated the power of health information exchanges, which have been instrumental in gathering and centralizing data about COVID-19 cases. Coupled with some regulatory updates, this year has proved that improved interoperability is possible when enough stakeholders are on board.
On a smaller scale, the shift toward value-based care and the rise of accountable care organizations (rather than a strictly fee-for-service model) have meant that tech vendors are more inclined to adopt their own strategies and practices encouraging data interoperability, such as committing to open APIs and incorporating provider feedback when developing tools.