Why the Future of Healthcare Chatbots Looks More Human Than Machine

Healthcare chatbots must be more human than machine to succeed.

They’re called Alexa, Siri, Viv and Cortana, the last named for a character in the hyper-violent Halo video game franchise. What’s considered the very first chatbot, Eliza, launched in the mid-1960s.

No matter what the chatbot is named, its future in healthcare may be limited by something we humans take for granted: the innate ability to communicate with passion, empathy and concern. These traits are especially important in healthcare, where conversations between provider and patient are often tense and emotionally charged.

So the top priority for healthcare chatbot developers must be improving these interactional skills: making chatbots more human when they connect with their human customers.

Researchers publishing an overview of conversational healthcare agents in the Journal of the American Medical Informatics Association divide these agents into three categories:

  • Chatbots having the ability to enter into casual conversations;
  • Avatar-based chatbots that look something like humans and can “talk” to users; and
  • Smart-speaker-type interfaces with the ability to take “orders” and answer simple user questions.

The powers behind general and healthcare chatbots alike are artificial intelligence (AI) and natural language processing (NLP). AI inherently has no empathy, so feelings must be built into the program, and even then the technology has only the emotions its developers assign. One healthcare chatbot could be extremely formal, another very friendly, while another might be a computerized reflection of the user. But no matter how a healthcare chatbot presents itself and communicates, whether like an overly formal dissertation or in Millennial slang, imparting caring and emotion to the transaction and conversation is a challenge.

Healthcare chatbots, basically, need to be more human and less machine-like. Researchers surmise patients will grow attached to a healthcare chatbot if it has linguistic and anthropomorphic prompts similar to those of the user. “(R)esearchers are suggesting that conversational agents need to adopt the characteristics of human-human interaction in order to be more engaging. To this end, human-computer interaction researchers have started to study how users react to anthropomorphic characteristics of conversational agents and how such cues affect the behavior of individuals,” according to Swiss researchers. “(P)atients may also perceive a sense of interpersonal closeness with a THCB,” or text-based healthcare chatbot.

Chatbot Market Set for Massive Growth

This not-yet-perfect, AI-powered digital technology is here to stay and seems poised for significant worldwide growth. The global healthcare chatbot market accounted for approximately $97 million in 2017 and is expected to grow to more than $618 million by 2026, expanding at a CAGR of 22.8 percent, according to Stratistics MRC.

Related healthcare chatbot and app news shows hundreds of tech companies are developing and launching smartphone apps and computer programs focused on everything from elder care and aging to mental health and stress relief. Start-up funding for the companies building these digital health technologies totals approximately $2 billion. (Undoubtedly, many of these companies will use chatbots to work directly with consumers, gathering information and responding to simple and complex questions.)

In addition, the healthcare virtual assistants market, which includes smart speakers, is projected to grow significantly, from $391 million in 2019 to $1.7 billion by 2024, at a CAGR of 34.6 percent, according to ResearchAndMarkets.com. “Growth in this market,” the organization explains, “is driven by the growing number of smartphone users and the increasing use of healthcare applications, the growing demand for quality healthcare delivery, and the rising prevalence of chronic disorders.”

Are we ready to fully surrender our complex health challenges to healthcare chatbots? No doubt many of us already do so today or are getting comfortable with the idea by using digital personal assistants. The research, so far, is mixed and patients have yet to move en masse to the digital technology.

Chatbots and Your Health

Your data, my data; it’s out there and it’s not coming back. In a day when search engines can locate and return just about anything about anyone, should we be concerned about getting too close to chatbots? They are simply another form of digital technology getting to know us on a very personal basis. Could chatbots, though, become too chatty?

Penn State researchers say “no.” While the group’s study published in Cyberpsychology, Behavior & Social Networking was small, researchers found participants got along better with a caring chatbot than with one that dispensed advice in a cold, clinical way. “In the sympathetic version, the chatbot responded with a statement, such as, ‘I am sorry to hear that.’ The chatbot programmed for cognitive empathy, which acknowledged the user’s feelings, might say, ‘That issue can be quite disturbing.’ A chatbot that expressed affective empathy might respond with a sentence that showed the machine understood how and why a user felt the way they did, such as, ‘I understand your anxiety about the situation.’”

In fact, the Penn State researchers went one step further, explaining that a sympathetic chatbot could win over non-believers, those cynics unwilling to embrace the entire machines-can-have-feelings paradigm. “Data reveal that expression of sympathy and empathy is favored over unemotional provision of advice, in support of the Computers are Social Actors (CASA) paradigm. This is particularly true for users who are initially skeptical about machines possessing social cognitive capabilities.”
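
As a rough, hypothetical illustration of how such response styles might be wired in (this is a sketch of my own, not the Penn State group’s implementation; the three empathetic replies are the examples quoted above, while the style names and the advice-only line are assumptions), a developer could simply key canned replies off a configured empathy style:

```python
# Hypothetical sketch: a chatbot whose "empathy" is a developer-assigned
# response style. The three empathetic replies are the examples quoted
# from the Penn State study; everything else here is illustrative.
EMPATHY_RESPONSES = {
    "advice_only": "You should discuss this with your doctor.",  # assumed wording
    "sympathy": "I am sorry to hear that.",
    "cognitive_empathy": "That issue can be quite disturbing.",
    "affective_empathy": "I understand your anxiety about the situation.",
}


def reply(style: str) -> str:
    """Return the canned reply configured for the given empathy style."""
    return EMPATHY_RESPONSES.get(style, EMPATHY_RESPONSES["advice_only"])


print(reply("sympathy"))           # -> "I am sorry to hear that."
print(reply("affective_empathy"))  # -> "I understand your anxiety about the situation."
```

The point is less the code than the design fact it exposes: whatever warmth the chatbot shows is a fixed template a developer chose in advance.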

In the UK, there seems to be less comfort in replacing doctors with chatbots. In a survey of 4,200 people, 61 percent said they’d rather have a phone call or a video chat to talk about health issues. Only 7 percent—among all age groups—were interested in a chatbot consultation.

Returning to my initial question: Are we ready to share even more with technology? If you’re not ready now, in time you likely will be, and to some extent, you won’t have a choice. “As consumers become more and more comfortable digitally sharing their personal information,” explains UserTesting research, “accessing medical advice or information via an app may soon become the new normal.”

[Screenshot: a conversation with the Eliza chatbot]

My Chatbot Chat

With that notion in mind, I decided to give Eliza a try and put her to the Turing test. I didn’t get far. No matter how much healthcare chatbots may sound and act like humans today, and likely more so in the future, they are, as Morpheus posited in The Matrix, simply computer software constrained by a set of programming rules they must obey. This makes chatbots less flexible and adaptable than humans.

In fact, the Journal of the American Medical Informatics Association researchers found that healthcare chatbots rely on rule-based approaches: “(R)ule-based approaches used in finite-state dialogue management systems are simple to construct for tasks that are straightforward and well-structured, but have the disadvantage of restricting user input to predetermined words and phrases, not allowing the user to take initiative in the dialogue, and making correction of misrecognized items difficult.”
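
To make that limitation concrete, here is a minimal, Eliza-style sketch in Python of rule-based dialogue handling (the patterns and responses are hypothetical, not drawn from any particular healthcare chatbot): the first pattern that matches a user’s message wins, and anything the rules don’t anticipate falls through to a generic fallback.

```python
import re

# Hypothetical, ELIZA-style rule table: each rule pairs a regular
# expression with a response template. The first match wins; anything
# the rules do not anticipate falls through to a canned fallback.
RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE),
     "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.IGNORECASE),
     "How long have you been {0}?"),
    (re.compile(r"\bmy (head|back|stomach) hurts\b", re.IGNORECASE),
     "I'm sorry to hear that. How long has your {0} been hurting?"),
]

FALLBACK = "Please tell me more."


def respond(user_input: str) -> str:
    """Return the first matching canned response, else the fallback."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    return FALLBACK


print(respond("I feel dizzy every morning"))
# -> "Why do you feel dizzy every morning?"
print(respond("Can you explain my lab results?"))
# -> "Please tell me more."  (no rule matches, so the bot cannot adapt)
```

The second exchange shows the rigidity the researchers describe: a user who tries to take the initiative simply isn’t in the rule table.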

These challenges and others were confirmed by UserTesting, which found the five healthcare chatbots it studied suffered from three issues:

  • Chatbot apps do a poor job of handling complex conditions.
  • Chatbot apps still need to earn consumer trust.
  • Basic usability issues detracted from the experience.

So when I asked Eliza how she was doing, I received a confused, existentially loaded answer from a chatbot desperate to understand and confirm its humanness. (A screenshot of the conversation is above.)

“Hello, I am Eliza.”
“How are you?” I typed.
“Why are you interested in whether or not I am?”

Before healthcare chatbots can bridge the divide between machine and customer and earn patient confidence, they will need an inherent understanding of themselves and, more importantly, of humans, while communicating with trust, emotion and empathy.