We live in a digital age in which almost every human action generates data. Whether through online shopping, a digital step counter, or a routine check-up with a provider, our daily routines leave a sizeable digital trail of uniquely personal data.
For example, global data experts estimate that in 2022, human activity consumed and produced roughly 94 zettabytes of data. The scale is difficult to grasp: one zettabyte equals one billion terabytes, and a single terabyte holds roughly as much data as 200 single-layer DVDs. Ninety-four zettabytes, then, is on the order of 20 trillion DVDs' worth of data.
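The conversion is easy to check directly. The short sketch below assumes a single-layer DVD capacity of 4.7 GB:

```python
ZETTABYTE_TB = 10**9          # 1 zettabyte = one billion terabytes
DVD_GB = 4.7                  # single-layer DVD capacity, in gigabytes

total_tb = 94 * ZETTABYTE_TB  # the 2022 estimate, expressed in terabytes
dvds = total_tb * 1000 / DVD_GB  # terabytes -> gigabytes -> DVDs

# 94 zettabytes works out to 2.0e13, i.e. about 20 trillion DVDs.
assert abs(dvds - 2.0e13) < 1e11
```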
The Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted almost 15 years ago, funded the build-out of electronic health record (EHR) systems throughout the country. Today, health-related data’s ever-growing footprint, generated by sources like EHRs, personal fitness applications, lab results, and home genome test kits, is rapidly outpacing that of other industries such as finance and manufacturing. In fact, data scientists estimate that around 30% of the world’s data volume comes from the healthcare industry.
For healthcare organizations, providers, and patients looking for more healthcare transparency, tapping into this treasure trove of insights can provide a holistic view of a person’s medical history. This presents a tremendous opportunity for industry-wide transformation, including using patient medical records to drive analytic insights, patient journey mapping, predictive analytics, and industry-wide automation.
Although the exponential growth of healthcare data provides valuable insights into wellness and illness for patients and other players in the healthcare ecosystem, the ability to turn this knowledge into actionable outcomes depends on the quality of the data. Just as a recipe needs fresh ingredients in their appropriate quantities, healthcare organizations, providers, and patients need clinical data that is precise, clean, and consistent to make informed decisions, regardless of the native source or format. However, the challenge lies in leveraging this data at scale, as it is captured in different ways and formats, making it increasingly difficult to aggregate and use effectively.
On average, patients see two primary care providers and seven specialists across four separate practices each year.⁴ Getting these providers to communicate seamlessly in real time, across multiple workflows and EHR platforms that each capture data differently, makes precision, cleanliness, and consistency challenging.
These communication gaps add to the complexity and fragmentation of the ever-growing body of medical knowledge and medically generated data. The result can be higher costs, patient and provider disengagement, and diminished care quality through unnecessary tests, medical errors, and adverse health outcomes from poorly managed chronic conditions.⁵
Every year, the United States healthcare system generates one billion healthcare encounters, documented with unstructured data and inconsistent coding variations.⁶ This inconsistent documentation undermines the efficiency and effectiveness of clinical workflows.
Clinical data is typically multi-source and multi-format in nature, complicating semantic interoperability.
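To make "multi-source and multi-format" concrete, consider the same lab result arriving from two systems under different local codes. The Python sketch below is illustrative only: the local codes and the crosswalk table are hypothetical, while the target LOINC code (4548-4, hemoglobin A1c) is a real standard code:

```python
# Two sites report the same hemoglobin A1c result under different local codes.
# These source records and local code names are hypothetical.
record_a = {"source": "hospital_ehr", "code": "LAB-HBA1C", "value": "6.1", "units": "%"}
record_b = {"source": "clinic_lis", "code": "A1C-01", "value": 6.1, "units": "percent"}

# Crosswalk from site-specific codes to a standard terminology (LOINC).
LOCAL_TO_LOINC = {
    "LAB-HBA1C": ("4548-4", "Hemoglobin A1c/Hemoglobin.total in Blood"),
    "A1C-01": ("4548-4", "Hemoglobin A1c/Hemoglobin.total in Blood"),
}
UNIT_ALIASES = {"%": "%", "percent": "%"}

def normalize(record):
    """Map a site-specific lab record onto a shared, LOINC-coded shape."""
    loinc_code, display = LOCAL_TO_LOINC[record["code"]]
    return {
        "system": "http://loinc.org",
        "code": loinc_code,
        "display": display,
        "value": float(record["value"]),
        "unit": UNIT_ALIASES[record["units"].lower()],
    }

# After normalization, the two records are recognizably the same observation.
assert normalize(record_a) == normalize(record_b)
```

Semantic interoperability is precisely this mapping problem at scale: thousands of local vocabularies, unit conventions, and formats that must be reconciled before records from different sources can be compared at all.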
To understand the fragmentation issue from a clinical perspective, imagine Jane, a 68-year-old patient with hypertension, hyperlipidemia, and acute lower back pain. Her primary care doctor manages her prescriptions and conducts her annual wellness visits, and has referred her for routine colon cancer screening and to a cardiologist to evaluate what the primary care doctor suspects is an increased risk of coronary artery disease.
The cardiologist schedules an echocardiogram and a stress test, which are performed. Before the follow-up visit to review the results, Jane travels out of state to visit her daughter and begins to have chest pain that radiates to her left arm, accompanied by nausea and shortness of breath. Like many patients, Jane must tell her own medical story when she arrives at the emergency room: she needs to recall her medications, allergies, and medical conditions, as well as explain her recent testing.
Given the geographic distance, the hospital cannot access her primary care and cardiology records to inform her care. Even if access were possible, each care site uses a different electronic medical record platform. This lack of access and standardization means that Jane’s story, told and understood without complete medical literacy, becomes the foundation of care decisions in the new clinical environment. Her admission to the hospital for a heart attack generates critical data about her ongoing care.
Upon release from the hospital, she is given discharge instructions and prescribed new medications. The inpatient cardiologist thoughtfully prints copies of critical tests and treatments for Jane to take back to her cardiologist at home: 80 pages of documentation in total. While Jane is armed with the printed documents, integrating that information is the manual job of clinical and administrative staff at her cardiologist’s office. Data that was entered in a structured format has become unstructured and must be restructured if it is to complete her history. Abstracting the full story from 80 pages of text is simply not possible in the 10-minute duration of a typical outpatient visit, so her cardiologist at home again relies on Jane’s own account of her inpatient hospitalization.
This tedious, inefficient process is fraught with significant gaps in knowledge about her health and can result in poor care and management of her conditions. As time goes on, the fidelity of Jane’s story degrades further, because it is built on the omissions, duplications, and inferences of a patient who is not fully medically literate. As the clinical story degrades and Jane’s medical complexity grows, her experience, outcomes, and overall health degrade as well.
Jane’s care journey is relatively simple. Imagine more chronically ill patients with more subspecialists, in more locations of care, documenting that care in more formats; or patients who receive care more episodically and inconsistently because of social factors. The degradation of story fidelity, and the ensuing degradation in care quality, happens even more rapidly.
To break this vicious cycle and its devastating repercussions, the Centers for Medicare &amp; Medicaid Services (CMS) Interoperability and Patient Access final rule prioritizes patients by ensuring they have access to their electronic medical records, allowing them to share concise yet comprehensive data with family members, primary care and specialty providers, and acute care locations to improve care continuity. Providers and payers are required to give their patients and members API access to electronic clinical data to foster better communication.
Recognizing both the need to comply and the value of real-time clinical data as a necessary supplement to often delayed and incomplete claims data, many health plans turned to interoperability solutions to meet the CMS Patient Access mandate by the July 1, 2021 deadline. They began to transact rich clinical data from the point of care with members in the Fast Healthcare Interoperability Resources (FHIR) standard, hoping to generate a more holistic understanding of an individual’s interactions with the healthcare system, leading to better decision-making and better health outcomes. To learn how Health Care Service Corporation (HCSC) upcycles its clinical data with Availity to make it usable and actionable for Patient Access and many digitally enabled use cases, read the complete case study.
As more plans and providers comply by generating shareable FHIR data, it is essential to acknowledge that incomplete or low-quality data fails to fulfill CMS’s intent in pushing FHIR as the standard for data sharing and interoperability. Health plans and providers must understand that shareable data does not necessarily equate to accurate or reliable data. For clinical data to be successfully leveraged as a strategic asset, it must be organized, consistent, and complete before it is transformed into the FHIR standard. Raw clinical data from disparate sources typically contains inaccurate codes, syntax errors, and reams of redundant information.
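As a rough illustration of what "transformed into the FHIR standard" means in practice, the Python sketch below reshapes a flat, site-specific lab record into a FHIR R4 Observation resource. The raw field names are hypothetical; the resource structure follows the published FHIR R4 Observation schema, and the LOINC code shown (2093-3, total cholesterol) is real:

```python
def to_fhir_observation(raw):
    """Reshape a flat source record (hypothetical field names)
    into a FHIR R4 Observation resource as a plain dict."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": raw["loinc"],
                "display": raw["test_name"],
            }]
        },
        "subject": {"reference": f"Patient/{raw['patient_id']}"},
        "effectiveDateTime": raw["collected_at"],
        "valueQuantity": {
            "value": raw["value"],
            "unit": raw["units"],
            "system": "http://unitsofmeasure.org",  # UCUM units
            "code": raw["units"],
        },
    }

raw_record = {  # hypothetical source row from one site's lab feed
    "patient_id": "12345",
    "loinc": "2093-3",
    "test_name": "Cholesterol [Mass/volume] in Serum or Plasma",
    "collected_at": "2023-04-01T09:30:00Z",
    "value": 187,
    "units": "mg/dL",
}
obs = to_fhir_observation(raw_record)
```

The point is that this mapping only produces a trustworthy resource if the inputs (the code, the units, the timestamp) are already clean; FHIR standardizes the container, not the quality of what goes into it.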
At Availity, our clinical informatics team conducts data quality analyses as part of our deployments, including annual real-world data audits. We routinely find that upwards of 50% of source clinical data cannot be used in its raw native form because of incompleteness, lack of interpretability, or absence of medical concepts. That’s why we believe an industry commitment not only to data interoperability standards but also to applying technology that fixes data quality will always be required to ensure the safety and efficacy of healthcare delivery.
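The kinds of checks such an audit applies can be sketched simply. The rules below are illustrative, not any vendor's actual audit logic: a record is flagged unusable if it lacks a code, carries a code that fails a basic syntax check for its stated code system, or duplicates an earlier record.

```python
import re

# Illustrative quality rules -- basic syntax patterns for two common
# code systems (a real pipeline would also validate against terminology
# services and value sets).
CODE_PATTERNS = {
    "ICD-10-CM": re.compile(r"^[A-Z]\d[0-9A-Z](\.[0-9A-Z]{1,4})?$"),
    "LOINC": re.compile(r"^\d{1,5}-\d$"),
}

def audit(records):
    """Split records into usable and rejected, with a reason per rejection."""
    usable, rejected, seen = [], [], set()
    for rec in records:
        key = (rec.get("system"), rec.get("code"), rec.get("patient_id"))
        if not rec.get("code"):
            rejected.append((rec, "missing code"))
        elif rec.get("system") in CODE_PATTERNS and not CODE_PATTERNS[rec["system"]].match(rec["code"]):
            rejected.append((rec, "malformed code"))
        elif key in seen:
            rejected.append((rec, "duplicate"))
        else:
            seen.add(key)
            usable.append(rec)
    return usable, rejected

records = [  # hypothetical sample feed
    {"patient_id": "1", "system": "ICD-10-CM", "code": "I10"},  # hypertension: ok
    {"patient_id": "1", "system": "ICD-10-CM", "code": "I10"},  # duplicate
    {"patient_id": "1", "system": "LOINC", "code": "4548-4"},   # hemoglobin A1c: ok
    {"patient_id": "1", "system": "ICD-10-CM", "code": "HTN"},  # malformed local code
    {"patient_id": "1", "system": "LOINC", "code": None},       # missing code
]
usable, rejected = audit(records)  # 2 usable, 3 rejected
```

Even this toy audit rejects more than half of its sample feed, which mirrors the pattern we see in real-world source data.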
Clinical data transformed into actionable assets to comply with the Patient Access mandate can and should be leveraged to serve additional digital use cases such as analytics, quality measurement, and reporting.
Today, the pursuit of a comprehensive “golden record” that combines multiple data types, including clinical data, social determinants of health (SDoH) data, behavioral health data, and consumer data, is driving the industry towards a more holistic approach. Building a robust data fabric that integrates all relevant data can provide a prescriptive patient journey, leading to better healthcare outcomes.
The value of clinical data in this pursuit cannot be overstated. Clinical data possesses unique characteristics that make it immensely valuable.
Developing a long-term data strategy that prioritizes high-quality and interoperable clinical data assets is crucial for healthcare organizations as they work to comply with the CMS Interoperability and Patient Access Final Rule and beyond. A long-term data strategy involves creating a plan to continuously improve data quality, standardize data across systems, and integrate data from various sources. By building on the data assets created for compliance with the Patient Access mandate, healthcare organizations can design a valuable resource that can be used to drive insights and inform decision-making across the healthcare ecosystem.
A subsequent CMS interoperability rule, proposed in December 2022, further highlights the need for increased interoperability in healthcare. If finalized, it will require the implementation of FHIR standards to streamline prior authorization processes for medical services and items and to improve health information access for patients and providers. By creating a comprehensive data strategy that prioritizes data quality and interoperability, healthcare organizations can lay the groundwork for success in the ever-changing healthcare landscape.
Implementing a long-term data strategy in healthcare can profoundly impact patients, their providers, and health insurance plans. Take Jane, for example. She suffers from a chronic condition that requires regular monitoring and care. With a long-term data strategy in place, her provider has access to a complete and accurate medical history, making it easier to make informed medical decisions and provide the best possible care. For Jane, this means her condition is managed more effectively, leading to better health outcomes and an improved quality of life. It also means she spends less time filling out paperwork and answering the same questions repeatedly. By reducing administrative burdens, Jane can focus more on her health and well-being.
For providers, a long-term data strategy can lead to improved care quality, better decision-making, and increased efficiency. With access to complete and accurate medical records, providers can make more informed diagnoses and create personalized treatment plans, and they can save time and reduce costs by streamlining administrative processes. For health insurance plans, a long-term data strategy can lead to cost savings, increased efficiency, and improved customer satisfaction. By leveraging clinical data assets, insurance plans can better understand their patient population, identify areas for improvement, and make data-driven decisions that lead to better health outcomes and reduced costs.
For health insurance plans, one of the many benefits of using high-quality interoperable clinical data assets is that it can enable more advanced analytics and reporting capabilities. By collecting and analyzing data from across the healthcare system, health plans can gain a more comprehensive view of member health and outcomes, as well as identify trends and patterns that might not be apparent through traditional reporting methods. This can help organizations to make more informed decisions about patient care, identify areas for improvement in quality and efficiency, and develop new insights and strategies for managing at risk populations.
In conclusion, implementing a long-term data strategy in healthcare can significantly impact patients, providers, and health insurance plans. By prioritizing data quality and interoperability, healthcare organizations can improve care outcomes, reduce administrative burdens, and drive cost savings. A long-term data strategy also helps healthcare organizations prepare for future changes in the healthcare landscape, such as the Payer-to-Payer mandate.
Discover how a long-term data strategy can unlock the full potential of your healthcare organization and how Availity Fusion™, our API-based technology engine, supports Upcycling Data™ by aggregating data from EHRs, health information exchanges, labs, and other sources and then normalizing and enhancing it to deliver a consistent, standards-based data asset that’s ready for use.