Microsoft and Nuance join forces to help doctors turn their focus back to patients

Imagine a visit to your doctor’s office in which your physician asks you how you’ve been feeling, whether your medication is working or if the shoulder pain from an old fall is still bothering you — and his or her focus is entirely on you and that conversation.

The doctor is looking at you, not at a computer screen. He or she isn’t moving a mouse around hunting for an old record or pecking on the keyboard to enter a diagnosis code.

This sounds like an ideal scenario, but as most people know from their own visits to the doctor, it’s far from the norm today.

But experts say that in an exam room of the future enhanced by artificial intelligence, the doctor would be able to call up a lab result or prescribe a new medicine with a simple voice command. She or he wouldn’t be distracted by entering symptoms into your electronic health record (EHR). And at the end of the visit, the essential elements of the conversation would have been securely captured and distilled into concise documentation that can be shared with nurses, specialists, insurance companies or anyone else you’ve entrusted with your care.

A new strategic partnership between Microsoft and Nuance Communications Inc. announced today will work to accelerate and deliver this level of ambient clinical intelligence to exam rooms, allowing ambient sensing and conversational AI to take care of some of the more burdensome administrative tasks and to provide clinical documentation that writes itself. That, in turn, will allow doctors to turn their attention fully to taking care of patients.

Of course, there are still immense technical challenges to getting to that ideal scenario of the future. But the companies say they believe that they already have a strong foundation in features from Nuance’s ambient clinical intelligence (ACI) technology unveiled earlier this year and Microsoft’s Project EmpowerMD Intelligent Scribe Service. Both are using AI technologies to learn how to convert doctor-patient conversations into useful clinical documentation, potentially reducing errors, saving doctors’ time and improving the overall physician experience.

“Physicians got into medicine because they wanted to help and heal people, but they are spending a lot of their time today outside of the care process,” said Joe Petro, Nuance executive vice president and chief technology officer. “They’re entering in data to make sure the appropriate bill can be generated. They’re capturing insights for population health and quality measures. And although this data is all important, it’s really outside a physician’s core focus on treating that patient.”

Primary care doctors spend two hours on administrative tasks for every hour they’re involved in direct patient care, studies have shown. If they don’t capture a patient’s complaint or treatment plan during or shortly after an exam, that documentation burden will snowball as the day goes on. In another recent study, physicians reported one to two hours of after-hours work each night, mostly related to administrative tasks.

The shift to digital medical record keeping and the so-called ‘meaningful use’ regulations were well-intentioned and have provided some important benefits, said Dr. Ranjani Ramamurthy, senior director at Microsoft Healthcare, who leads the company’s EmpowerMD research.

People no longer have to worry about deciphering a doctor’s handwriting or about information that never makes it into the right paper file. But the unintended consequence has been that doctors are sometimes forced to focus on their computers and administrative tasks instead of their patients, she said.

After starting her career in computer science, Ramamurthy went back to school to get a medical degree and pursue cancer research. But as she walked the halls of the hospital every day, she couldn’t help thinking that she was missing an opportunity to use her background to create tech solutions that could reinvigorate the doctor-patient relationship.

Ramamurthy noted that most physicians got into healthcare because they want to use their skills and expertise to treat patients, not to feel tethered to their keyboards.

“We need to work on building frictionless systems that take care of the doctors so they can do what they do best, which is take care of patients,” she said.

Built on Microsoft Azure — and working in tandem with the EHR — this new technology will marry the two companies’ strengths in developing ambient sensing and conversational AI solutions. Those include ambient listening with patient consent, wake-up word, voice biometrics, signal enhancement, document summarization, natural language understanding, clinical intelligence and text-to-speech.
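
As a rough illustration of the “ambient listening” piece, the sketch below shows one way an exam room recording could be turned into a raw transcript with Azure’s speech-to-text service. The SDK, key, region and audio file here are assumptions for illustration only; the article does not describe the companies’ actual ACI implementation, which layers consent handling, speaker separation, summarization and EHR integration on top of basic transcription.

```python
# Minimal sketch of ambient speech capture with the Azure Speech SDK (Python).
# The subscription key, region and audio file are placeholders; the real ACI
# pipeline described in the article is far more involved than this.
import time

import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="YOUR_SPEECH_KEY", region="YOUR_REGION")
audio_config = speechsdk.audio.AudioConfig(filename="exam_room_visit.wav")

recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config, audio_config=audio_config)

transcript = []
done = False

def on_recognized(evt):
    # Append each final recognition result to the running transcript.
    if evt.result.reason == speechsdk.ResultReason.RecognizedSpeech:
        transcript.append(evt.result.text)

def on_stopped(evt):
    # Fires when the audio stream ends or recognition is canceled.
    global done
    done = True

recognizer.recognized.connect(on_recognized)
recognizer.session_stopped.connect(on_stopped)
recognizer.canceled.connect(on_stopped)

recognizer.start_continuous_recognition()
while not done:
    time.sleep(0.5)
recognizer.stop_continuous_recognition()

print("\n".join(transcript))
```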

Nuance is a leading provider of AI-powered clinical documentation and decision-making support for physicians. Leveraging deep strategic partnerships with the major providers of EHRs, the company has spent decades developing medically relevant speech recognition and processing solutions such as its Dragon Medical One platform, which allows doctors to easily and naturally enter a patient’s story and relevant information into an EHR using dictation. Nuance conversational AI technologies are already used by more than 500,000 physicians worldwide, as well as in 90 percent of U.S. hospitals.

Microsoft brings deep research investments in AI and partner-driven healthcare technologies, commercial relationships with nearly 170,000 healthcare organizations, and enterprise-focused cloud and AI services that accelerate and enable scalable commercial solutions. Earlier this month, for instance, Microsoft announced a strategic collaboration to combine its AI technology with Novartis’ deep life sciences expertise to address challenges in developing new drugs.

In other areas, Azure Cognitive Services offers easy-to-deploy AI tools for speech recognition, computer vision and language understanding, and trusted Azure cloud services can help healthcare organizations meet their privacy and regulatory compliance requirements.

As part of the agreement, Nuance will migrate the majority of its current on-site internal infrastructure and hosted products to Microsoft Azure. Nuance already is a Microsoft Office 365 customer for its more than 8,500 employees worldwide, empowering them with the latest in collaboration and communications tools, including Microsoft Teams.

“Just capturing a conversation between two people has been a thorny technical problem for a long time, and a lot of companies have attempted to crack it,” Petro said. “This partnership brings two trusted healthcare superpowers together to solve some of the most difficult challenges and also to leverage the most innovative advances we’ve made in AI, speech and natural language processing.”

The companies will build on Nuance’s early success with ACI and expect to introduce the technology to an initial set of physician specialties in early 2020, then expand it to numerous other medical specialties over the next few years, Petro said. Initially, the ACI output may be checked by a remote reviewer with medical expertise to provide an important quality check and to produce additional training data for the AI models. Once the system has proven its accuracy for a given physician, the ACI documentation will go directly to that physician, who can review it, make any necessary revisions and sign off on a treatment plan in real time, Petro said.

With a patient’s consent, ACI is designed to securely ingest and synthesize patient-doctor conversations, integrate that data with information from an EHR, populate a patient’s chart and also help the EHR deliver intelligent recommendations to the doctor.

With innovations in multi-party speech recognition, language understanding and computer vision, these tools can listen to the encounter between the doctor and a patient who grants consent, sense whether they’re pointing to a left knee or right knee when verbally describing a particular pain, extract medically relevant details and translate what just occurred in the exam room into actionable clinical documentation and care suggestions.
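
As a hedged sketch of the “extract medically relevant details” step, the snippet below uses Azure Text Analytics for Health to pull clinical entities out of a raw transcript. The endpoint, key and sample sentence are placeholders, and the article does not say that ACI uses this particular service; it simply stands in for the kind of language-understanding model such a pipeline would need.

```python
# Hedged sketch: pulling clinical entities out of a visit transcript with
# Azure Text Analytics for Health. Endpoint, key and transcript text are
# placeholders; this is an illustration, not the partnership's actual stack.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("YOUR_LANGUAGE_KEY"),
)

transcript = [
    "Patient reports persistent pain in the right knee after a fall last year. "
    "Currently taking 400 mg ibuprofen twice daily with limited relief."
]

# Healthcare entity analysis runs as a long-running operation.
poller = client.begin_analyze_healthcare_entities(transcript)
for doc in poller.result():
    if doc.is_error:
        continue
    for entity in doc.entities:
        # e.g. "right knee" (BodyStructure), "ibuprofen" (MedicationName),
        # "400 mg" (Dosage) -- candidates for the structured clinical note.
        print(entity.text, entity.category, entity.confidence_score)
```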

“Moving forward, we recognize that reducing the burden of clinical documentation is just the beginning,” said Dr. Greg Moore, Microsoft’s corporate vice president for health technology and alliances. “As the core AI improves and becomes more capable, it will be able to understand much more deeply what is going on by observing doctors and nurses in their day to day work. Ambient clinical intelligence will be able to work in tandem with the EHR to help convert those observations into supportive, augmenting actions.”

For instance, an AI-enabled system can learn to recognize when a doctor is talking to a patient about a new medication, and it can automatically review past conversations and the patient’s history to reduce the risk of a drug interaction or allergic reaction. Or it can mine a patient’s complicated medical history alongside newly reported symptoms and suggest potential diagnoses for the doctor to consider.
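
To make the drug-interaction example concrete, here is a deliberately simplified, hypothetical check of a newly discussed medication against a patient’s current medications and allergies. The interaction table and patient data are invented for illustration; a real system would draw on curated pharmacology databases and the patient’s full record, as described above.

```python
# Purely hypothetical sketch of the drug-interaction idea described above.
# The interaction table and patient data are made up; a production system
# would rely on curated pharmacology databases and the full EHR.
from typing import List, Set

# Toy table of known interacting pairs (order-independent).
KNOWN_INTERACTIONS: Set[frozenset] = {
    frozenset({"warfarin", "ibuprofen"}),    # bleeding risk
    frozenset({"lisinopril", "ibuprofen"}),  # reduced antihypertensive effect
}

def check_new_medication(
    new_drug: str,
    current_medications: List[str],
    allergies: List[str],
) -> List[str]:
    """Return warnings the system could surface to the doctor in real time."""
    warnings = []
    if new_drug.lower() in (a.lower() for a in allergies):
        warnings.append(f"Patient has a recorded allergy to {new_drug}.")
    for med in current_medications:
        if frozenset({new_drug.lower(), med.lower()}) in KNOWN_INTERACTIONS:
            warnings.append(f"Potential interaction: {new_drug} with {med}.")
    return warnings

# Example: the doctor mentions ibuprofen for a patient already on warfarin.
print(check_new_medication("ibuprofen", ["warfarin", "metformin"], ["penicillin"]))
# -> ['Potential interaction: ibuprofen with warfarin.']
```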

In addition, the two companies will open up the ACI platform to an ecosystem of partners that can bring other highly valuable AI innovations to the exam room or the bedside where the ambient sensing device will be present.

“We want ambient clinical intelligence to assist the EHR in delivering recommendations at the time when it matters — not three days later on your patient portal or when a nurse follows up, but when the doctor and patient are face to face and when that information can actually inform care,” Ramamurthy said.

CREDITS: Microsoft Blog

https://blogs.microsoft.com/ai/nuance-exam-room-of-the-future/
