By Dr. Jay Anders, Chief Medical Officer, Medicomp Systems
Twitter: @MedicompSys
Host of Tell Me Where IT Hurts – #TellMeWhereITHurts
NLP has dominated the news cycle with Microsoft’s recent deal to acquire Nuance for $19.7 billion. With Google and Amazon still at play in healthcare NLP, more changes are sure to come. But what is NLP, how useful is it in healthcare, and what are its drawbacks?
In the second episode of Tell Me Where IT Hurts, I talk with Dr. Tim O’Connell and explore how NLP is improving clinical documentation, driving actionable insights, and helping reduce clinician burnout. Together, we also take a candid look at its shortcomings.
Episode NOW on Demand
NLP is transforming healthcare
Dr. O’Connell says NLP brings numerous benefits to healthcare. It can do everything from scanning a patient’s chart and nursing notes for early signs of complications to identifying and linking medical slang to single concepts, improving care quality and efficiency. Regarding medical slang, he says emtelligent is training its language models to link multiple surface terms to the same underlying concepts.
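To make the slang-to-concept idea more concrete, here is a minimal sketch in Python. It is not emtelligent’s actual approach; the lookup table, terms, and concepts are invented purely for illustration:

```python
# A toy illustration of concept normalization: many surface forms map to one concept.
# The terms and concepts below are made-up examples, not a real clinical terminology.
SLANG_TO_CONCEPT = {
    "mi": "myocardial infarction",
    "heart attack": "myocardial infarction",
    "myocardial infarction": "myocardial infarction",
    "sob": "shortness of breath",
    "short of breath": "shortness of breath",
}

def normalize(term):
    """Map a clinician's shorthand or slang to a single canonical concept (or None)."""
    return SLANG_TO_CONCEPT.get(term.strip().lower())

print(normalize("Heart attack"))  # -> myocardial infarction
print(normalize("SOB"))           # -> shortness of breath
```

A production engine would, of course, learn these mappings from language models and clinical terminologies rather than a hand-built dictionary, but the end result is the same: many ways of saying something resolve to one concept.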
Breaking it down further, Dr. O’Connell explains that NLP is a technology that reads, understands, and structures human prose (e.g., physician documentation). In healthcare, for example, it enables clinicians to access specific patient information without having to sift through reams of data. It also allows a medical researcher studying knee pain across a patient population to quickly identify patients who have had MRIs without reading lengthy physician notes.
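To illustrate that research scenario, here is a small hypothetical Python sketch. It assumes an NLP engine has already converted free-text notes into structured records; the records and field names are invented for illustration:

```python
# A toy example of why structured NLP output is useful: once free-text notes have
# been turned into structured records (by a hypothetical NLP engine), a researcher
# can filter fields instead of reading every note. All data here is invented.
from dataclasses import dataclass

@dataclass
class ExtractedFinding:
    patient_id: str
    problem: str    # normalized concept, e.g. "knee pain"
    procedure: str  # normalized concept, e.g. "MRI knee"

findings = [
    ExtractedFinding("p001", "knee pain", "MRI knee"),
    ExtractedFinding("p002", "knee pain", "knee x-ray"),
    ExtractedFinding("p003", "low back pain", "MRI lumbar spine"),
]

# Which knee-pain patients have had an MRI?
knee_mri_patients = {
    f.patient_id
    for f in findings
    if f.problem == "knee pain" and f.procedure.startswith("MRI")
}
print(knee_mri_patients)  # -> {'p001'}
```

The point is not the code itself but the shift it represents: queries run against structured fields rather than against pages of prose.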
While NLP is groundbreaking in healthcare, I point out that the technology is still poorly understood, with confusion about how NLP and AI work together. Dr. O’Connell says to think of AI as a broad umbrella term for a computer that mimics things humans do, like reading notes, understanding language, and communicating with one another. NLP, on the other hand, is more specific: it is the branch of computer science in which computers read and understand both human- and computer-produced text.
Voice recognition technology: Still waiting for the “magic” to happen
Dr. O’Connell also weighs in on the promise and limitations of voice recognition technology. I note that while voice recognition technology is often described as “magical” for its promise to capture an entire exam room conversation, it has not yet fulfilled that promise. Dr. O’Connell concurs, saying the technology still raises concerns over patient privacy, security, and accuracy. “NLP enables a deep learning engine to distill a verbatim recorded conversation into what it thinks is most important,” he says. The trouble is that it doesn’t always pick up verbal nuances or communication implied through body language, which a human can easily interpret.
“I don’t see the mic in the sky approach working very well in the current context of patient care and the patient visit,” I agree, noting that body language alone can convey how a person is feeling.
On the other hand, as a radiologist who dictates notes daily, Dr. O’Connell sees NLP’s strengths and weaknesses firsthand. While there are still plenty of issues with speech recognition accuracy, physicians need to be able to both type and dictate their notes. For example, he says, NLP is very adept at handling complex narratives, such as cases where patients have pancreatic cancer. A pancreatic tumor can be described in a nearly infinite number of ways, yet NLP can still create a structured note that is accurate 100 percent of the time, something that would be near impossible to achieve when typing up a note.
Medicomp and emtelligent’s unique collaboration
We also discuss the promising partnership between Medicomp and emtelligent to deploy an NLP engine that encodes clinical concepts, diseases, and clinical relationships. The Medicomp MEDCIN platform is designed to better understand unstructured medical text. “We have a great platform here together, and I think it is incredibly powerful for anyone who wants to roll out the next generation of healthcare applications today,” says O’Connell.
We agree that as the two companies’ technology progresses to improve clinician documentation and provide intelligent prompts, it will also help alleviate clinician burnout. Some physicians are looking to get out of medicine simply because they are tired and burned out.
What’s next for NLP and healthcare IT?
In closing, while deep-pocketed mega corporations are entering the NLP space, Dr. O’Connell is bullish on the future and says emtelligent has the distinct advantage of building an NLP engine that caters to clinicians, something the bigger IT companies have not yet achieved. What’s the No. 1 change healthcare IT needs to make? Listen to the full podcast to find out and to learn more about the future of NLP in healthcare and how it will play an even greater role in clinician documentation.
This article was originally published on the Medicomp blog and is republished here with permission.
About the Show
On Tell Me Where IT Hurts, join host Dr. Jay Anders as he sits down with experts from across healthcare and technology to discuss ways to improve EHR usability for end users. Dr. Anders and his guests explore opportunities to enhance clinical systems to make them work better for clinicians, reduce burnout, maximize revenue potential, and drive better patient care outcomes. Join the conversation on Twitter at #TellMeWhereITHurts.