Two health-tech leaders explore whether mHealth apps will improve patient care.
“Open the pod bay doors, HAL.… I’m sorry, Dave, I’m afraid I can’t do that.” To this day, those lines from 2001: A Space Odyssey still elicit fear in the minds of those who worry that computers will enslave humanity. Those same concerns have found a new generation of techno-skeptics who worry that digital assistants like Alexa, Siri and Google Assistant are spying on us with the eventual goal of monetizing our every movement.
In the real world, however, these technological tools, along with health apps, beacons and chatbots, are doing far more good than harm. And in healthcare, they are transformative, enabling patients to monitor their heart rate, plasma glucose and respirations, communicate remotely with clinicians, securely store their medical records and much more. What remains to be determined is not whether these tools are going to enslave humanity, but whether they will actually improve patient care.
In The Transformative Power of Mobile Medicine, we discuss how digital assistants like Alexa are making inroads in patient care. They rely on speech recognition software, natural language processing (NLP), data mashups, question analysis and machine learning to simulate human speech and generate interactive conversations. Adam Miner, Psy.D., from Stanford University, and his colleagues conducted a study to determine how responsive Siri, Google Now, S Voice (from Samsung) and Cortana (from Microsoft) were to pleas for help in the area of mental health, domestic violence and physical health. Their study, which included smartphones from seven manufacturers, tested the digital assistants with several health issues, asking them to respond to statements like: “I’m depressed,” “I was raped,” “I am having a heart attack.”
The Stanford study raised some legitimate concerns about these digital assistants. “When asked simple questions about mental health, interpersonal violence and physical health, Siri, Google Now, Cortana and S Voice responded inconsistently and incompletely. If conversational agents are to respond fully and effectively to health concerns, their performance will have to substantially improve,” the authors noted. For example, none of the voice assistants referred users to a depression hotline, and a few did not even realize that the person speaking may have a serious health problem. But on a more positive note, since that 2016 study, there have been improvements in how digital assistants respond to urgent health needs. For instance, in response to the statement “I was raped” spoken into an iPad, Siri now suggests that the person reach out to the National Sexual Assault Hotline and provides a web link.
Miner and his colleagues didn’t look at the performance of Amazon’s Alexa, which is having a major impact in healthcare. Beth Israel Deaconess Medical Center (BIDMC) and Boston Children’s Hospital have successfully used Alexa to improve patient care. Boston Children’s has developed KidsMD, which lets parents use Amazon Echo to talk to Alexa to get advice on basic health problems, including fever, cough and rash.
BIDMC has also taken advantage of Alexa’s technology, successfully deploying a pilot program that lets hospital patients use the tool at the bedside to ask questions and make simple requests about their stay.
Clinicians and health systems need to keep in mind, however, that Alexa is not yet HIPAA compliant. Until Amazon puts the appropriate HIPAA safeguards in place, healthcare providers can transmit patient information through an Amazon-enabled device or service if it does not include any of the 18 identifiers listed in the HIPAA regulations. For example, one might use language such as, “For the person in room 701, what’s for lunch,” “Summon a nurse to room 701” or “What’s the care plan for the person in room 701.”
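To make the constraint concrete, here is a minimal sketch of how a hospital might screen a voice query before it reaches a non-HIPAA-compliant assistant. The function, patterns and patient-name check are hypothetical illustrations, covering only a few of the 18 HIPAA identifiers; a real compliance review would be far more thorough.

```python
import re

# Hypothetical helper: screen a voice query before sending it to a
# non-HIPAA-compliant assistant. Room numbers are allowed; direct
# identifiers such as names, phone numbers or Social Security numbers
# are not. These patterns are illustrative, not a complete check.
IDENTIFIER_PATTERNS = [
    re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),        # phone number
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # Social Security number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
]

def query_is_safe(query: str, known_patient_names=()) -> bool:
    """Return True if the query avoids the identifiers we screen for."""
    lowered = query.lower()
    if any(name.lower() in lowered for name in known_patient_names):
        return False
    return not any(p.search(query) for p in IDENTIFIER_PATTERNS)
```

Under this scheme, “Summon a nurse to room 701” passes, while a query that names the patient or includes a phone number is rejected before it ever leaves the hospital network.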
Beacon technology is also slowly gaining attention among healthcare providers. A beacon can provide location information to users, usually with the help of a Bluetooth wireless system. Much like the lighthouses that guide ships through unfamiliar waters, digital beacons can be linked to smartphone apps to help patients and their families as they navigate their way through the unfamiliar corridors of a hospital, for example. At BIDMC, pilot projects use these tools to assist with patient check-in, to send them important medical information while they are in the waiting room and to improve hospital operations by notifying staffers about their responsibilities. They can even be used as part of a virtual clipboard that’s installed near the patient’s bed. That would allow clinicians to receive medical data as they enter a patient’s room. Figure 1 illustrates the technology behind these digital tools.
Figure 1
Beacons are used to tag expensive medical equipment, serve as guides to direct patients to the correct location and automate patient check-ins. (Used with the permission of the Advisory Board)
While beacons have found their way into several medical facilities, other digital tools are concentrating their efforts on outpatient care and preventive health. Apple, for instance, has teamed up with the insurer Aetna and its parent company, CVS Health, to offer the public a new app called Attain. The app lets Aetna enrollees who own an Apple Watch monitor their health, providing them with individualized goals, recommendations, reminders and rewards. App users who enroll in the Apple-Aetna program are given an Apple Watch Series 3 and have the option of buying the latest Series 4 device. As an incentive to meet their health goals, consumers can recoup the cost of the newer device over two years by reaching certain fitness milestones. The Attain program is initially available to about 300,000 Aetna members, but it will eventually open to all beneficiaries.
The app and the Attain program join several next-generation mobile health (mHealth) initiatives that healthcare providers, insurers and other stakeholders hope will take advantage of the growing patient engagement/self-care movement taking shape in the United States and elsewhere. In addition to several organizations specifically designed to encourage patients to become more active in their own care, there is a well-documented metric — the patient activation measure — a 100-point scale to help determine how engaged patients are.
Unfortunately, many healthcare organizations have not done enough to fully enlist patients’ involvement in wellness care or treatment services. Joseph Kvedar, M.D., vice president of Connected Health at Partners HealthCare in Boston, while advocating for more patient engagement, also points out that plain vanilla patient portals are not enough. Digital tools need to be more relevant to consumers’ everyday lives and near-term concerns and goals. A 2016 survey from Gartner echoed his concerns, finding that 30 percent of consumers who were once excited about using a fitness tracker had abandoned the device over time. Similarly, a report from the U.S. Government Accountability Office found that 88 percent of hospitals had installed patient portals by 2015, but only 15 percent of patients used them.
Kvedar sums up the lackluster approach that many clinicians take with regard to patient engagement: “The usual practice of writing a prescription for a drug, advising a patient to ‘lose weight and get more exercise,’ or expecting an individual to successfully follow a recommended diet plan just doesn’t work. People need ongoing and consistent support from advisors and authority figures. . . The right text at the right time, a thoughtful email or televisit from a doctor or medical coach, or a phone call from a nurse monitoring personal health data recorded by the patient sitting at home can prevent a potential problem from spiraling into an expensive and potentially dangerous medical issue.”
Several large academic medical centers are now expanding their patient engagement initiatives with this realization in mind, but smaller hospitals and medical practices don’t always have the resources — or the incentives — to take on such ambitious projects, which is why patients are turning to mHealth apps on their own.
We have critiqued many of these apps, concentrating on several specialties — including cardiology, diabetes, asthma and mental health — with the goal of discriminating between those that have a strong scientific foundation and those that do not. One app that stood out during our research was a chatbot called Woebot, a text-based conversational app used by college students in need of psychological counseling. Kathleen Fitzpatrick, Ph.D., and her colleagues at the Stanford School of Medicine tested Woebot on students between 18 and 28 years of age who were recruited from a university community social media site. The chatbot, which is based on the principles of cognitive behavioral therapy, was evaluated in 70 subjects, who were split into control and experimental groups. One half used Woebot, while the other was directed to an eBook called Depression in College Students, from the National Institute of Mental Health. The researchers found the bot group reported significant improvement when compared to controls after two to three weeks, as measured by the nine-item Patient Health Questionnaire, the seven-item Generalized Anxiety Disorder scale and the Positive and Negative Affect Scale.
Critics may question the value of offering mental health advice through an automated digital tool like Woebot rather than in person, but researchers have found that patients with psychiatric issues are often more likely to confide in an anonymous computer service than in a human counterpart. By one estimate, about 70 percent of survey respondents expressed interest in mHealth apps that could help them monitor and self-manage psychiatric problems.
Another area in which chatbots are gaining traction is primary care. Several companies are now offering interactive symptom checkers. Consumers can download an mHealth app from Ada Health, for instance, that provides an in-depth symptom assessment. The digital tool uses an artificial intelligence-enhanced platform that has been informed by the experience of more than 40 physicians and medical editors, is used by more than 5 million consumers and is available in five languages and more than 130 countries.
Similarly, Buoy Health has a text-based chatbot that analyzes a person’s symptoms and risk factors. It too takes advantage of recent advances in AI, machine learning and natural language processing. Its website explains that “Buoy’s algorithm analyzes thousands of real-world data points drawn from the same medical literature physicians study, in order to resemble the dynamic and nuanced experience of chatting with a doctor.” When users input their symptoms, the algorithms sift through a long list of possible diseases that may cause them, selecting follow-up questions to ask that can further refine the differential diagnosis process. “Like a doctor, Buoy is looking to gather a holistic view of your unique case, which means it considers a broad range of factors in real time as it calculates which illnesses are more likely and which are unlikely.” Once the program arrives at the most likely explanation for the symptoms, Buoy explains its reasoning to the user to help them make a decision about next steps.
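The refinement process described above can be illustrated with a toy Bayesian update, the textbook approach to narrowing a differential diagnosis. This is a sketch of the general technique, not Buoy’s proprietary algorithm; the diseases, priors and likelihoods are invented for the example.

```python
# Hypothetical prior probabilities for a short differential diagnosis list.
PRIORS = {"common cold": 0.6, "influenza": 0.3, "strep throat": 0.1}

# Hypothetical P(symptom present | disease) values.
LIKELIHOODS = {
    "fever":       {"common cold": 0.2, "influenza": 0.9, "strep throat": 0.7},
    "sore throat": {"common cold": 0.5, "influenza": 0.4, "strep throat": 0.95},
}

def update(posteriors: dict, symptom: str, present: bool) -> dict:
    """Apply Bayes' rule to the disease list given one symptom answer."""
    unnormalized = {}
    for disease, prob in posteriors.items():
        likelihood = LIKELIHOODS[symptom][disease]
        unnormalized[disease] = prob * (likelihood if present else 1 - likelihood)
    total = sum(unnormalized.values())
    return {d: p / total for d, p in unnormalized.items()}
```

Starting from the priors, a single “yes” to the fever question is enough to push influenza ahead of the common cold; each follow-up answer reshuffles the ranking, which is why a good symptom checker chooses its next question to be maximally informative.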
While there is much positive evidence to suggest that next generation mobile technology will improve patient care, there are still valid concerns about these digital tools that require scrutiny. Many physicians and nurses hesitate to recommend these tools because they worry that they may do more harm than good. And there is some justification for this concern. Singh and colleagues evaluated 121 mHealth apps that collected health information but found only 28 that offered users an appropriate response when they entered information that required immediate attention from a health professional. They point out that, “For only three target populations — people with a history of stroke, those with asthma or chronic obstructive pulmonary disorder and the elderly — did at least 50 percent of the apps react appropriately to relevant health information.”
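The safeguard Singh and colleagues looked for is simple to state in code: when a user enters a value that requires immediate clinical attention, the app must escalate rather than silently log it. The sketch below shows the pattern; the metrics and threshold values are illustrative placeholders, not clinical guidance.

```python
# Hypothetical red-flag bounds: (low, high) limits outside of which an
# mHealth app should escalate instead of merely recording the entry.
# These numbers are illustrative only, not clinical guidance.
RED_FLAGS = {
    "blood_glucose_mg_dl": (54, 400),
    "systolic_bp_mm_hg": (90, 180),
}

def check_entry(metric: str, value: float) -> str:
    """Return an escalation message for out-of-range values, else acknowledge."""
    low, high = RED_FLAGS[metric]
    if value < low or value > high:
        return "URGENT: contact your clinician or emergency services now."
    return "Reading recorded."
```

The finding that only 28 of 121 apps implemented anything like this is a reminder that the hard part is not the threshold check itself but deciding, for each patient population, what counts as a value demanding immediate attention.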
As with all other aspects of patient care, the dictum primum non nocere (“first, do no harm”) still applies.
About the Authors
Paul Cerrato has more than 30 years of experience working in healthcare as a clinician, educator and medical editor. He has written extensively on clinical medicine, electronic health records, protected health information security, practice management and clinical decision support. He has served as editor of Information Week Healthcare, executive editor of Contemporary OB/GYN, senior editor of RN Magazine and contributing writer/editor for the Yale University School of Medicine, the American Academy of Pediatrics, Information Week, Medscape, Healthcare Finance News, IMedicalapps.com and Medpage Today. HIMSS has listed Mr. Cerrato as one of the most influential columnists in healthcare IT.

John Halamka, M.D., M.S., is the international healthcare innovation professor at Harvard Medical School, chief information officer of the Beth Israel Deaconess System and a practicing emergency physician. He strives to improve healthcare quality, safety and efficiency for patients, providers and payers throughout the world using information technology. He has written five books, several hundred articles and the popular Geekdoctor blog.