GLENDALE, CA, July 25, 2019 — The U.S. healthcare system is unable to consistently match patients with their records, largely because of the fragmented way electronic health records (EHRs) have been adopted. Not only does this endanger patients and waste money, it strains already-overburdened IT infrastructures, according to Condusiv Technologies CEO James D’Arezzo.
“In the half century since electronic medical record technology was introduced, hospitals and doctors have used hundreds of EHR vendors, each of which is constantly updating its technology,” said D’Arezzo, whose company is the world leader in I/O reduction and SQL database performance.
With more than 700 EHR providers, there are too many opportunities for inconsistency, D’Arezzo said.
For the last three decades, healthcare organizations have been moving toward digital patient records. Ten years ago, Congress passed the Health Information Technology for Economic and Clinical Health (HITECH) Act and invested $40 billion in health IT implementation.
Unfortunately, progress has been elusive. According to a recent investigative report by Kaiser Health News and FORTUNE, “Rather than an electronic information ecosystem, the nation’s thousands of EHR systems remain a sprawling, disconnected patchwork. Unlike, say, the global network of ATMs, the proprietary EHR systems don’t talk to each other, meaning that doctors still resort to transferring medical data via fax and CD-ROM.”1
One result is widespread duplication. The average hospital has a duplicate patient record rate of about 20 percent, a problem that costs the U.S. healthcare industry $6 billion annually;2 a 2018 survey by Black Book Market Research similarly found that an average of 18 percent of patient records are duplicates. Matching rates between organizations—such as between a doctor’s office and a hospital—can be extremely low. Even when two organizations share the same EHR vendor, variable data entry protocols can drop match rates to as low as 50 percent.
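A brief illustrative sketch (hypothetical data and field names, not Condusiv code) shows how small differences in data-entry conventions can make an exact-match lookup fail between two organizations, and how normalizing fields before comparison recovers the match:

```python
# Illustrative sketch: why exact matching fails across organizations
# when data-entry conventions differ, and how normalization helps.
def normalize(record):
    """Reduce hypothetical patient fields to a canonical matching key."""
    return (
        record["last_name"].strip().lower(),
        record["first_name"].strip().lower()[:1],  # first initial only
        record["dob"].replace("/", "-"),           # unify date separators
    )

clinic_record   = {"first_name": "Jonathan", "last_name": "Smith ", "dob": "04/07/1975"}
hospital_record = {"first_name": "Jon",      "last_name": "smith",  "dob": "04-07-1975"}

# A raw field-by-field comparison fails...
print(clinic_record == hospital_record)                        # False
# ...but the normalized keys agree.
print(normalize(clinic_record) == normalize(hospital_record))  # True
```

Real master-patient-index systems use far more sophisticated probabilistic matching, but the underlying problem is the same: without consistent entry protocols, equivalent records do not compare as equal.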
This can have serious consequences. In 2016, two patients in Worcester, Mass., with the same name had kidney scans on the same day. Their records were mixed up, and surgeons removed a perfectly healthy kidney from the wrong patient.3
In addition to endangering patients, record duplications can impact a healthcare organization’s accreditation. In the U.S., the Joint Commission, an organization responsible for the accreditation and certification of healthcare organizations, accredits about 77 percent of the nation’s hospitals. Among the areas evaluated by the Joint Commission is patient safety, on which the accuracy and reliability of EHRs have a direct bearing. The majority of state governments recognize Joint Commission accreditation as a condition of licensure for the receipt of Medicaid and Medicare reimbursements.4
Duplicate records also impact a healthcare organization’s IT function. Multiple searches for scattered versions of duplicate information—pulling a record for Jonathan Smith, John Smith, Jon Smith and J. Smith, all with the same address and date of birth—further burden an IT infrastructure already stretched to capacity. This can aggravate or even cause input-output (I/O) performance degradation, which affects the system’s ability to perform database searches and analytics. Such degradation can lower the system’s overall throughput by 50 percent or more.
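The name-variant scenario above can be sketched in a few lines (hypothetical data, not a production record-matching algorithm): grouping records that share a date of birth and address surfaces the likely duplicates that a system would otherwise have to search for one variant at a time.

```python
# Illustrative sketch: records with varying name spellings but identical
# date of birth and address are grouped as likely duplicates.
from collections import defaultdict

records = [
    {"name": "Jonathan Smith", "dob": "1975-04-07", "address": "12 Elm St"},
    {"name": "John Smith",     "dob": "1975-04-07", "address": "12 Elm St"},
    {"name": "J. Smith",       "dob": "1975-04-07", "address": "12 Elm St"},
    {"name": "Mary Jones",     "dob": "1980-01-15", "address": "9 Oak Ave"},
]

groups = defaultdict(list)
for r in records:
    groups[(r["dob"], r["address"])].append(r["name"])

# Any group with more than one name is a likely duplicate cluster —
# and each extra entry means another record the system must fetch.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
for key, names in duplicates.items():
    print(key, "->", names)
```

Every extra entry in such a cluster translates directly into additional I/O: one logical patient lookup becomes several physical record retrievals.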
This performance degradation is particularly common in Windows-based systems because of a structural characteristic of the operating system itself: as files are written and updated over time, Windows tends to break writes into increasingly small, fragmented I/O operations. With roughly 80 percent of the installed base of healthcare IT systems running Windows, the problem is widespread across the industry.
“Given the importance of computation and data management to healthcare organizations, there is a temptation, even in an era of strained budgets, to throw money at the problem in the form of new hardware,” D’Arezzo said.
“There’s good news and bad news,” he added. “The bad news is that new hardware won’t help because I/O degradation is a software problem. The good news is that relatively inexpensive software solutions exist. Condusiv Technologies, the world leader in this area, provides solutions that—without the need for additional investment in hardware—can, in the Windows environment, improve overall system throughput by 30 percent to 50 percent or more.”
About Condusiv Technologies
Condusiv® Technologies is the world leader in software-only storage performance solutions for virtual and physical server environments, enabling systems to process more data in less time for faster application performance. Condusiv guarantees to solve the toughest application performance challenges with faster-than-new performance via V-locity® for virtual servers or Diskeeper® for physical servers and PCs. With over 100 million licenses sold, Condusiv solutions are used by 90 percent of the Fortune 1000 and almost three-quarters of the Forbes Global 100 to increase business productivity and reduce data center costs while extending the life of existing hardware. Condusiv Chief Executive Officer Jim D’Arezzo has had a long and distinguished career in high technology.
Condusiv was founded in 1981 by Craig Jensen as Executive Software. Jensen authored Diskeeper, which became the best-selling defragmentation software of all time. Over the past 38 years, he has transformed that early leadership in file system management and caching into enterprise software.
1. Schulte, Fred, and Fry, Erika, “Death by 1,000 clicks: Where electronic health records went wrong,” Fierce Healthcare, March 28, 2019.
2. LaRow, Mark, “How Patients Suffer from The Worst Kept Secret in Health IT,” HIT Consultant, June 11, 2019.
3. Butcher, Lola, “American Health Care Is Flooded with Duplicate Medical Records,” The Atlantic, March 22, 2019.
4. “Facts about Hospital Accreditation,” jointcommission.org, September 12, 2018.
For more information, visit https://condusiv.com
Follow us on Twitter and like us on Facebook