Guest Column | July 31, 2019

EHRs Add To IT Headaches For Healthcare Organizations

By James D’Arezzo, Condusiv Technologies


The use of electronic health records (EHRs) is supposed to make healthcare safer and more efficient. Yet many experts and clinicians say digital patient records are instead compromising safety, lowering efficiency, burdening clinicians and straining maxed-out IT infrastructures.

While there are national mandates to address and solve the lack of EHR interoperability, the most urgent problem is the limitations of healthcare IT infrastructures. Without IT systems that can process and analyze the growing amount of healthcare data, efforts to improve interoperability will be largely futile.

About 80 percent of healthcare IT systems are Windows-based. The Windows operating system has an inherent structural issue that can cause serious performance deficiencies when processing and analyzing large volumes of data. This can lead to major IT system slowdowns and even crashes.

Since healthcare depends on getting the right data at the right time to the right people, these issues can affect life-or-death situations.

For the last 30 years, healthcare organizations have moved toward digital patient records, with 96 percent of U.S. hospitals and 78 percent of physicians’ offices now using EHRs, according to the National Academy of Medicine. A recent report from market research firm Kalorama Information states that the EHR market topped $31.5 billion in 2018, up 6 percent from 2017.

Ten years ago, Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act and invested $40 billion in health IT implementation.

Despite that investment, the lack of EHR standardization results in a high number of discrepancies across duplicate patient records. For example, the same patient may appear in multiple healthcare organizations’ EHR systems, but with differing or missing information, including names, addresses and Social Security numbers.
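To see why these discrepancies defeat naive record matching, consider a simplified sketch. The records, field names and matching rules below are illustrative inventions, not drawn from any specific EHR vendor's schema or matching algorithm:

```python
# Hypothetical records for the same patient held by two different systems.
# Field names and values are illustrative only.
rec_a = {"name": "John Q. Smith", "dob": "1970-03-15", "ssn": "123-45-6789"}
rec_b = {"name": "Smith, John",   "dob": "1970-03-15", "ssn": ""}  # SSN missing

def exact_match(a, b):
    """Naive matching: every field must agree exactly."""
    return a == b

def demographic_match(a, b):
    """Looser matching on date of birth plus a normalized surname."""
    def surname(n):
        # "Smith, John" -> "smith"; "John Q. Smith" -> "smith"
        if "," in n:
            return n.replace(",", " ").split()[0].lower()
        return n.split()[-1].lower()
    return a["dob"] == b["dob"] and surname(a["name"]) == surname(b["name"])

print(exact_match(rec_a, rec_b))        # False: the records look like two patients
print(demographic_match(rec_a, rec_b))  # True: same person once normalized
```

Real patient-matching systems weigh many more fields probabilistically, but the sketch shows the core problem: without standardized data entry, an exact comparison treats one patient as two, inflating duplicate counts across the 700-plus EHR systems in use.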

A 2018 Black Book survey found that patient-record matching in EHRs has a significant effect on hospital spending, workflow and patient safety.

This lack of interoperability among the more than 700 EHR providers results in a national healthcare system that is digitally disconnected.

According to a recent survey of 110 senior healthcare leaders by Dimensional Insight and HIMSS Analytics, healthcare organizations are struggling to process an increasing amount of digital data and are frustrated by the number and variety of analytics tools they are forced to use.

And the amount of data is only going to increase. According to a recent report from International Data Corporation, the volume of data processed in the overall healthcare sector is projected to increase at a compound annual growth rate of 36 percent through 2025, significantly faster than in other data-intensive industries such as manufacturing, financial services and media and entertainment.

Impact On HIT Systems

A healthcare organization’s IT function is affected both by this tsunami of data and by the EHR duplicate-record problem.

For example, records for one patient may be scattered across multiple places, requiring database searches and analytics that can cause or aggravate input/output (I/O) performance degradation. Data analytics requires a computer system to access multiple, widespread databases, pulling information together through millions of I/O operations. The system’s analytic capability depends on the efficiency of those operations, which in turn depends on the efficiency of the computer’s operating environment.

Performance degradation, which can cut a system’s overall throughput capacity by 50 percent or more, occurs in any storage environment. The root cause lies in how Windows hands data off to storage. As the storage layer has been logically separated from the compute layer and more systems are virtualized, Windows handles I/O logically rather than physically, breaking reads and writes down to their lowest common denominator. The result is a stream of tiny, fractured, random I/O operations that creates a “noisy” environment.

Left untreated, this only worsens with time.
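The fragmentation effect can be illustrated in the abstract. The following Python sketch (a simplified illustration, not a description of Windows internals or of any vendor's product) counts how many write calls the same payload generates when issued as one large sequential write versus many small fragments:

```python
# Illustrative sketch: the same 4 MB payload can reach storage as one
# large sequential write or as many small fragmented writes. Fewer,
# larger I/Os mean fewer trips through the OS storage stack.
import os
import tempfile

PAYLOAD = b"x" * (4 * 1024 * 1024)  # 4 MB of data
FRAGMENT = 4 * 1024                 # 4 KB fragments

def write_whole(path, data):
    """One large sequential write: a single write call."""
    with open(path, "wb") as f:
        f.write(data)
    return 1  # number of write calls issued

def write_fragmented(path, data, chunk):
    """Many small writes: one write call per fragment."""
    calls = 0
    with open(path, "wb") as f:
        for i in range(0, len(data), chunk):
            f.write(data[i:i + chunk])
            calls += 1
    return calls

with tempfile.TemporaryDirectory() as d:
    big = write_whole(os.path.join(d, "big.bin"), PAYLOAD)
    frag = write_fragmented(os.path.join(d, "frag.bin"), PAYLOAD, FRAGMENT)
    print(f"{big} large write vs {frag} small writes for the same data")
```

Each of those thousand-plus small writes carries its own per-operation overhead through the storage stack, which is why a fragmented I/O pattern can halve effective throughput even though the total bytes written are identical.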
Even experienced IT professionals often assume, mistakenly, that new hardware will solve these problems.

I/O degradation is a software problem, one for which relatively inexpensive software solutions exist. Dealing with it does not require (and is not helped by) major investments in new computational and storage hardware.

About The Author

James D'Arezzo is CEO of Condusiv Technologies, a global provider of software-only storage performance solutions for virtual and physical server environments.