By Christy Green, Director of Strategic Consulting &
Cheryl Mason, MSHI, Senior Manager of Strategic Consulting and Clinical Informatics, Health Language
Twitter: @Health_Language
Today’s hospital and health system executives cannot help but hear the clamor to advance data-driven care delivery models focused on proactive population management. Pressed by a sense of urgency to position for risk-based reimbursement arrangements, executives often overlook a cornerstone of analytics success: data normalization. In fact, it is not uncommon for a health system to invest millions of dollars in an analytics solution, only to discover later that the reports it produces are inaccurate or meaningless because of missing or poor-quality data.
How can healthcare executives avoid this situation?
Recognizing the need for clean, accurate data is easy; achieving it is challenging. Any effort to lay the foundation for meaningful analytics begins with a full investigation of the data sources to be used and the data they contain. Today’s health systems often manage 40 or more clinical, claims, and administrative systems, each with its own terminology infrastructure. From a time and resource standpoint, understanding each system’s unique characteristics is notably complex. Simply put, it is tedious work.
While multiple systems across the enterprise likely capture the same or similar information, they store the data using different terminology standards or proprietary codes. Normalizing all of this data, relating it so that it can be analyzed effectively, can therefore be a monumental challenge. The first step is creating a single source of truth for standardizing data held in disparate systems. Health systems will also need to map proprietary codes to standards, and between standards, to support analytics.
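To make those two kinds of mapping concrete, here is a minimal sketch in Python, assuming a simple in-memory crosswalk. The source-system names and local codes are invented for illustration; the LOINC, ICD-10-CM, and SNOMED CT codes shown are real published codes. A production terminology platform would manage such content as curated, versioned map sets rather than hard-coded tables.

```python
from typing import Optional

# Kind 1: proprietary local code -> standard code (here, LOINC).
# The system names and local codes below are hypothetical.
LOCAL_TO_LOINC = {
    ("lab_system_a", "LAB_HBA1C_01"): "4548-4",  # Hemoglobin A1c/Hemoglobin.total in Blood
    ("lab_system_b", "GLYHB"): "4548-4",
}

# Kind 2: standard -> standard (here, an ICD-10-CM diagnosis to a SNOMED CT concept).
ICD10CM_TO_SNOMED = {
    "E11.9": "44054006",  # Type 2 diabetes mellitus -> Diabetes mellitus type 2 (disorder)
}

def normalize_lab_code(source_system: str, local_code: str) -> Optional[str]:
    """Return the LOINC code for a proprietary lab code, or None if unmapped."""
    return LOCAL_TO_LOINC.get((source_system, local_code))

print(normalize_lab_code("lab_system_a", "LAB_HBA1C_01"))  # -> 4548-4
```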
Foundationally, the solution lies in effective enterprise terminology management. Unfortunately, CIOs and health IT departments are overwhelmed by the task of data normalization amid a growing list of other high-level IT initiatives. Merger and acquisition trends across the industry further exacerbate the issue as new systems and platforms are continuously introduced into a health system’s greater IT strategy. One health system managing six different EHRs recently struggled simply to normalize the name of a hospital across its systems.
The complexities of managing clinical terminologies are much greater. Consider, for instance, an effort to track hemoglobin A1c values across a multi-institutional health system. The same test may be referred to as “HbA1c” at one institution, “A1c” at a second, and “glycosylated hemoglobin” at a third. All of these local names must be normalized to an industry standard, in this case LOINC, to ensure regulatory compliance, accurate data analytics, and interoperability that enables data exchange with an unambiguous, shared meaning.
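The sketch below illustrates that HbA1c scenario: the three local names are normalized to the single LOINC code for hemoglobin A1c (4548-4, Hemoglobin A1c/Hemoglobin.total in Blood) so that results from all three institutions can be analyzed together. The result values and the lookup function are invented for illustration; in practice the mapping table would come from a curated terminology service.

```python
HBA1C_LOINC = "4548-4"  # Hemoglobin A1c/Hemoglobin.total in Blood

# Each institution's local name for the same test maps to one LOINC code.
LOCAL_NAME_TO_LOINC = {
    "hba1c": HBA1C_LOINC,
    "a1c": HBA1C_LOINC,
    "glycosylated hemoglobin": HBA1C_LOINC,
}

def normalize_test_name(local_name: str):
    """Case-insensitive lookup; None signals 'route to human review'."""
    return LOCAL_NAME_TO_LOINC.get(local_name.strip().lower())

# Results reported under three different local names...
results = [("HbA1c", 6.8), ("A1c", 9.4), ("Glycosylated Hemoglobin", 7.1)]

# ...roll up under a single code once normalized.
a1c_values = [v for name, v in results if normalize_test_name(name) == HBA1C_LOINC]
print(sum(a1c_values) / len(a1c_values))  # one population-level average, ~7.77
```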
Today’s health systems need resources and consultation to help design a big-picture blueprint for reaching their analytics goals. For many organizations, the business case for expert third-party consultation, and for an advanced enterprise terminology management platform, is an easy one to make. Tools exist that automate the data normalization process and mediate the differences between disparate classification systems. Real-time auto-mapping and integration technologies incorporate standard clinical terminologies directly into healthcare software applications, so data can be safely exchanged and accurately analyzed. Ultimately, this removes the guesswork from complex terminology management.
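One common integration pattern behind such tooling, sketched below under the assumption that unmapped terms should be queued for terminologist review rather than guessed at, is an auto-mapping step in the data pipeline. The AutoMapper class and its interface are illustrative only, not a specific vendor API.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class AutoMapper:
    """Illustrative auto-mapping step: map what is known, queue the rest."""
    crosswalk: Dict[Tuple[str, str], str]
    review_queue: List[Tuple[str, str]] = field(default_factory=list)

    def map_term(self, source_system: str, local_term: str) -> Optional[str]:
        code = self.crosswalk.get((source_system, local_term.strip().lower()))
        if code is None:
            # Route unknowns to human review instead of silently dropping them.
            self.review_queue.append((source_system, local_term))
        return code

mapper = AutoMapper(crosswalk={("lab_system_a", "hba1c"): "4548-4"})
print(mapper.map_term("lab_system_a", "HbA1c"))      # -> 4548-4
print(mapper.map_term("lab_system_b", "HGBA1C WB"))  # -> None, queued for review
print(mapper.review_queue)                           # [('lab_system_b', 'HGBA1C WB')]
```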