By Irv H. Lichtenwald, President and CEO, Medsphere
Twitter: @MedsphereH
Earlier this year, Monmouth University conducted a survey to determine which issues were most important as the country transitions to a new presidential administration. Among all the potential concerns Americans now face, the issue that rises to the top is healthcare costs.
How acute a concern is this? It’s significant enough that, when asked the open-ended question, “Turning to issues closer to home, what is the biggest concern facing your family right now?”, 25 percent of respondents named healthcare costs as their number one issue.
“It’s also worth noting that issues that have been dominating the news, such as immigration and national security, rank very low on the list of items that keep Americans up at night,” said Director Patrick Murray of the politically independent Monmouth University Polling Institute.
The concerns Americans are voicing about the affordability of care are certainly not misplaced. Overall healthcare costs rose more last August than they have during any month since 1984.
Why do healthcare costs continue to climb?
The answer is complex, as healthcare economics are complex, and Obamacare, industry consolidation, drug prices and a host of other issues all factor in. Still, one input outweighs all others, according to a recent New York Times piece.
“The real culprit of increased spending? Technology,” health economist Austin Frakt writes. “Every year you age, health care technology changes — usually for the better, but always at higher cost. Technology change is responsible for at least one-third and as much as two-thirds of per capita health care spending growth.”
To be clear, Frakt is talking about all technology, not just healthcare IT. Indeed, among the sheer tonnage of medical technology that currently exists, IT probably makes up a relatively small share, but it does get lumped in with the larger group when looking at the direct relationship between tech and costs.
“Eagerness for innovation … seems to have created a culture where medical technologies are adopted prematurely and new medical technology is employed for additional uses beyond the original intent,” writes Mark Mack in a Government Finance Officers Association paper. “In some instances, technologies that offer only marginal improvements over existing treatments — but with dramatically higher price tags — are adopted broadly and rapidly.”
Is Mack talking about the amount of money required to go from paper to silicon? Perhaps not directly, but it has to be included. So let’s, for a moment, look at what our slice of healthcare technology actually does.
Healthcare IT, specifically EHRs, creates reminders, makes records readily available, enables more rapid communication and improves reporting processes. It also enables billing, provides life-saving best practices and cross-references data faster and more effectively than the human mind. All of these things make healthcare more efficient and safer.
And how much should a hospital be willing to pay for these safeguards? It’s a difficult question to answer, given that some hospitals can afford a little and some a lot.
If you’re Kaiser Permanente or Partners, you’re both willing and able to spend billions. Further down the hospital food chain, however, large organizations spending mere millions are taking on operational and financial challenges that will hopefully abate over time.
In 2014, for example, Becker’s Hospital CFO reported that Standard & Poor’s had downgraded the credit rating of Wake Forest Baptist Medical Center after the hospital posted a $56 million operating loss in the previous fiscal year due to an EHR purchase. Last year, Becker’s also reported a 56 percent drop in adjusted income related to MD Anderson’s EHR implementation, and shortly thereafter named seven other hospitals that implemented the same system and faced income shortfalls as a result.
Perhaps we should be asking how much a hospital has to spend to acquire technology sufficient to improve care.
Three-time national hospital of the year Pikeville Medical Center, for instance, can realistically afford quite a bit of EHR. But putting money there means not putting it in expanded services, so hospital decision makers opted for a system that’s comprehensive but less expensive.
And why should we care that health systems are spending billions on complex EHRs if they can afford it? Because healthcare costs don’t exist in the vacuum of one organization. Purchases like these push healthcare costs up for the entire system. Healthcare spending is now 17 percent of the American economy, double what it was in 1980, and the crest of the baby boomer healthcare wave hasn’t hit us yet. Again, these rising costs are the result of many factors, but healthcare technology of all kinds is an exacerbating, not a mitigating, factor.
It doesn’t have to be this way. Healthcare IT in general and EHRs in particular offer a unique opportunity to help moderate costs and provide a host of clinical, organizational and population health benefits by improving processes. But only if the systems don’t require mortgaging the future and putting the hospital in financial straits.
And this would be true even if U.S. healthcare were not moving into a potentially fraught historical period, which it is. The Affordable Care Act is threatened, potentially eliminating coverage for more than 20 million insured patients. Meaningful Use payments for healthcare IT adoption will be going away. Payment models are shifting to value instead of services.
Moving forward, healthcare IT must ensure that the tools we create become catalysts for slower growth of costs or (gasp!) even a reversal. Where technology has historically driven up the costs of healthcare, we need to make IT the exception.
This article was originally published on Medsphere and is republished here with permission.