Porter Research talks with Nuance’s Jonathan Dreyer about the rise of intelligent systems in healthcare, and how the cloud is poised to impact the next generation of mobile health
By Jennifer Dennard, social marketing director, Porter Research
Not long ago, the tricorder – that amazingly compact gadget used by doctors in the Star Trek sick bay to scan patients for space ailments – was confined to the realm of science fiction. (Replicas can be purchased on eBay for as little as $9.99.) Providers, and even patients, interested in the real thing no longer have to daydream, thanks to the Scanadu Scout. Currently in development, the Scout is a small, handheld scanner that measures vital signs. Assuming it passes FDA muster, it will be available in a year or two for $199.
Science fiction is quickly becoming reality in the world of mobile health – one that is poised to have a tremendously beneficial impact on provider workflows and patient outcomes. Star Trek references aside, the future of mobile health relies heavily on the rise of intelligent systems that power these kinds of space-age technologies. Porter Research recently sat down with Jonathan Dreyer, Director of Cloud and Mobile Solutions Marketing at Nuance, to better understand the role intelligent systems will play in the next phase of healthcare.
How do you define intelligent systems as they pertain to healthcare?
Intelligent systems have the ability to not only interact on a human level – by voice, gesture or touch – but they also understand and reason to deliver a desired outcome to someone – such as finding and instantly playing a movie.
More specifically for healthcare, these intelligent systems come in the form of natural, conversational and intuitive technologies that break down the barriers sitting between the physician and the patient – getting technology to work for doctors rather than against them.
Intelligent systems can help doctors address ever-changing technological shifts and get back to the bedside, practicing the art of medicine. These solutions, powered by voice, language understanding and artificial intelligence, are also helping put the focus back on the patient. They enable physicians to interact naturally with the EHR and other clinical systems to quickly retrieve patient information, delegate and fulfill patient care orders, easily navigate electronic records, and more fluidly issue care directives. Ultimately, intelligent systems simplify the day-to-day duties of the doctor and other members of the care team, which has critical implications for us – the patients.
Virtual assistants no longer seem as out of this world as they once did. How are intelligent systems powering this new type of mobile health?
Virtual assistants are a new breed of intelligent systems that aim to simplify how clinicians interact and engage with technology. They have the ability to dig for information within a patient's medical record, navigate complex healthcare applications and streamline end-to-end clinical documentation, all through conversational dialogue and commands. As the technology evolves, it will likely become increasingly intelligent, learning to understand an individual physician's wants and needs and becoming more proactive in its actions.
In addition, virtual assistants hold critical potential from a patient perspective. Now, more than ever, patients are focused on being their own health advocates and are seeking ways to leverage technology to take a more active role in their own care. Virtual assistants will act as enabling agents in this new era of patient engagement – from streamlining data entry via patient portals to letting someone speak to their phone to make smart dietary choices at the grocery store. The ease of use virtual assistants provide holds significant power to help patients make healthier, more informed choices.
We have two development partners that exemplify where mobile health is going – Sense.ly and Landmark Hospitals. Sense.ly's technology uses a virtual avatar, speech recognition, body recognition, augmented reality, video and medical devices to keep patients engaged as they communicate their symptoms, medical history and recovery progress. This communication can take place via a mobile phone, website or TV.
Landmark Hospitals has built its own EMR, and it is leveraging both Nuance speech recognition technology and our clinical language understanding technology to extract facts from the patient narrative. That's the start of getting into this type of intelligent system/virtual assistant workflow.
What role does the cloud have to play in this type of connected environment?
The cloud plays an extremely significant role in today's connected environment, especially as physicians and patients become more mobile. Cloud-based technology helps ensure a consistent experience across devices and platforms, and provides continuity with respect to how physicians and patients interact with technology.
Specific to voice recognition in the clinical setting, the cloud offers true anytime, anywhere access, independent of operating system, platform or device hardware specs. It also gives physicians continual access to the most up-to-date medical language models and dictionaries, as well as user-specific customizations, regardless of the device or platform on which those customizations were originally created. Furthermore, the cloud can connect with legacy desktop systems to carry years of user customizations forward into new and emerging mobile and Web-based healthcare apps.