Guest Column | October 12, 2017

Fighting Fire With FHIR: Driving Healthcare Interoperability

HIE Effectiveness Questioned As Vendors Align For More Interoperability

Fast Healthcare Interoperability Resources (FHIR): improving interoperability and provider access to data.

By Gavin Robertson, CTO and SVP, WhamTech

Healthcare providers are dealing with more data than ever, and a growing number of technologies gather and store that information. The problem facing the healthcare industry is how to make these disparate sources of data communicate with one another. It can be difficult for providers to access all of the necessary, up-to-date information held by insurers, hospitals, and clinics, which use a variety of electronic health record (EHR) systems, patient portals, and databases. With information coming from so many sources in incompatible formats, data access, sharing, and interoperability become nearly impossible.

Providers are trying to move beyond conventional health information exchanges (HIEs), which usually rely on point-to-point connections between individual applications. While standards like the HL7 data model have improved how data is managed, serious problems remain. Adopting the HL7 standard data model, for example, is complex: implementations vary widely, tend to be relational, and require further transformation of the data.

Even with the HL7 standard data model improvements, challenges facing interoperability remain, including:

  • A myriad of vendors structure and retain the same basic data in different ways
  • An ever-more complex data landscape that spans multiple applications and data sources
  • Poor data quality and limited ability to update data sources, especially in real time
  • The lack of a single patient view, i.e., a unified standard data view assembled from multiple data sources

FHIR (Fast Healthcare Interoperability Resources) APIs, built on an HL7 base, are one effort to standardize access to data sources. However, vendor systems still have a long way to go, because FHIR represents data differently from most underlying schemas (for example, an object representation vs. a hierarchical one). Additionally, a single FHIR API call may require access to a range of tables within one data source, or to multiple data sources.
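To make the object representation concrete, the sketch below builds a minimal FHIR Patient resource as a Python dictionary. The field names (resourceType, name, birthDate) come from the HL7 FHIR specification; the sample values and the helper function are invented for illustration.

```python
# A minimal FHIR Patient resource, expressed as a Python dict. Note how
# names are a repeating, nested structure rather than flat columns --
# the object representation the article contrasts with relational schemas.
patient = {
    "resourceType": "Patient",
    "id": "example-123",
    "name": [
        {"use": "official", "family": "Smith", "given": ["Jane", "Q."]}
    ],
    "birthDate": "1980-04-02",
}

def official_name(resource):
    """Return 'Given Family' for the first official name entry, if any."""
    for name in resource.get("name", []):
        if name.get("use") == "official":
            return " ".join(name.get("given", []) + [name.get("family", "")])
    return None

print(official_name(patient))  # Jane Q. Smith
```

A relational source would typically scatter these fields across several tables (patient, patient_name, etc.), which is why serving this one resource can touch multiple tables, as noted above.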

A common technique for implementing FHIR APIs is to adapt results on the fly from source schemas to the FHIR object model; however, this does not address many of the inadequacies above. Another approach is to physically copy and transform data into FHIR-friendly data stores, which enables data services and simplifies format conversion. Both methods introduce other problems, such as data aging, weakened security and privacy, lessened interactivity and real-time updates, breaks between storage and systems, and increased implementation time and cost.
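The first technique, on-the-fly adaptation, amounts to reshaping each result row into the FHIR object model at query time. The sketch below shows the idea; the relational column names (PAT_ID, LAST_NM, and so on) are hypothetical, not taken from any particular EHR.

```python
# Sketch of on-the-fly adaptation: a row from a hypothetical relational
# patient table is reshaped into the FHIR Patient object model as it is
# returned, rather than being copied into a separate FHIR store.
def row_to_fhir_patient(row: dict) -> dict:
    return {
        "resourceType": "Patient",
        "id": str(row["PAT_ID"]),
        "name": [{
            "use": "official",
            "family": row["LAST_NM"],
            "given": [row["FIRST_NM"]],
        }],
        # FHIR expects ISO-8601 dates; this assumes the source stores them so.
        "birthDate": row["BIRTH_DT"],
    }

row = {"PAT_ID": 42, "LAST_NM": "Smith", "FIRST_NM": "Jane", "BIRTH_DT": "1980-04-02"}
resource = row_to_fhir_patient(row)
print(resource["id"], resource["name"][0]["family"])  # 42 Smith
```

The mapping itself is simple; the difficulties the article describes arise because every vendor's source schema needs its own such mapping, and the mapping alone fixes neither data quality nor the single patient view.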

New technologies, such as WhamTech SmartData Fabric™ (SDF), have begun to surface that address most of these challenges. These emerging technologies offer an alternative to the approaches above: an index-based architecture that delivers high-quality, high-performance query access to applications and FHIR API services while keeping the data in the original sources, so results are obtained as needed.
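The index-based idea can be illustrated with a toy sketch (this is an illustration of the general technique, not WhamTech's actual implementation): an external index maps a search key to pointers into the original sources, so a query scans only the index and fetches matching records from the sources on demand.

```python
# Two "legacy" sources, left in place; queries never scan them directly.
hospital_db = {"H-1": {"name": "Jane Smith", "dob": "1980-04-02"}}
clinic_db = {"C-7": {"name": "Jane Smith", "dob": "1980-04-02"}}
SOURCES = {"hospital": hospital_db, "clinic": clinic_db}

# External index, built ahead of time: last name -> (source, key) pointers.
index = {"smith": [("hospital", "H-1"), ("clinic", "C-7")]}

def find_by_last_name(last_name):
    """Scan only the index, then resolve pointers to live source records."""
    for source_name, key in index.get(last_name.lower(), []):
        yield source_name, SOURCES[source_name][key]

results = list(find_by_last_name("Smith"))
print(len(results))  # 2 records, one per source
```

Because only the index is queried, the load on the original sources is limited to fetching the handful of matching records, which is what allows the data to stay where it is.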

No matter the approach, FHIR APIs enable interoperability while greatly expanding capabilities. More modern workflows can be driven directly from powerful end-user applications, such as new smartphone app-driven business process management (BPM) tools that work against FHIR API services, update in real time, and run on a variety of legacy data sources across many organizations. Hybrid cloud implementations, in which data sources in the cloud and on-premises are combined, are another good example.

While organizations work to establish interoperable systems, they must also ensure that the data is secure. New regulations and the sharp rise in attacks on healthcare organizations’ data make it essential to manage huge amounts of data securely. One way to do so is to keep data in its original source while enabling high-performance queries that execute against indexes and impose almost no load on the sources themselves. Other important parts of the solution include generating additional data, processing events, linking and visualizing data in graphs, and maintaining continuous, automatic master data management (MDM) unification for a comprehensive single view of data across the organization, such as patient data drawn from a variety of sources and at different federation levels. The future of technology, and healthcare technology in particular, will need to minimize coding while maintaining ease of use and customer-friendly configuration. Achieving true interoperability will come down to technologies that can function securely across multiple systems and platforms.
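The MDM-style unification mentioned above can be sketched with a simple deterministic match key (normalized name plus birth date) that groups records from several sources into one patient view. Real MDM systems use far more robust probabilistic matching; this only shows the shape of the idea, on invented sample records.

```python
from collections import defaultdict

def match_key(record):
    # Naive deterministic key; production MDM would use weighted,
    # probabilistic matching across many attributes.
    return (record["name"].strip().lower(), record["dob"])

def unify(records):
    """Group source records into single-patient clusters."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[match_key(rec)].append(rec)
    return list(clusters.values())

records = [
    {"source": "hospital", "name": "Jane Smith", "dob": "1980-04-02"},
    {"source": "clinic", "name": "jane smith ", "dob": "1980-04-02"},
    {"source": "payer", "name": "John Doe", "dob": "1975-11-30"},
]
views = unify(records)
print(len(views))  # 2 distinct patients from 3 source records
```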

As healthcare providers collect ever more data from a variety of sources, technology will have to help physicians analyze that information efficiently and securely, without digging through the digital masses. It should deliver better insights in less time and at lower cost. Watching these interoperability solutions being implemented to solve one of healthcare’s biggest problems makes this an exciting time to be in the industry, because ultimately these technologies have the potential to create better patient outcomes, reduce costs, and increase satisfaction for everyone.

About The Author

After earning a BSc (Hons) in Chemical and Process Engineering and an MEng in Petroleum Engineering, both from Heriot-Watt University, Edinburgh, Scotland, Gavin spent more than 15 years in the domestic U.S. and international oil and gas industry, often working on data-related projects. After a period as a consultant to WhamTech, Gavin joined as CTO and Senior VP in 1999, responsible for product design and development, and technical sales and marketing. Gavin’s Twitter handle is @WhamTech_CTO.