Is There Anything Worse than a “Thin” Chatbot?

Nov 27, 2018 · 6 min read

Since the invention of computers more than 50 years ago, the ultimate interface vision has been something like the HAL 9000 from 2001: A Space Odyssey. Not the sinister, self-aware machine that decides to start killing astronauts, but the advanced companion that humans converse with to accomplish great things. No keyboard, no touchscreen, no mouse required. Interaction is a human-to-humanoid conversation: fluid, natural, and simple.

We are on that road. Think about it: we have gone from punch cards to green screens, to the graphical user interface, to the Internet, to mobile, to IoT, to smart devices, and to self-driving cars. Technology evolution is exponential, which means advancements will not only keep happening, they'll keep happening at a faster rate. AI-powered chatbots represent the next innovation wave of human-machine interaction.


Talking to Devices is Now Kind of Normal

Most of us have now interacted with a smart device through some kind of verbal cue. Maybe you get answers to obscure questions by just asking Siri. It sure beats typing questions and instructions on that tiny keypad, which, ironically, can be made even more cumbersome by the phone’s AI-powered auto-correct feature.

I cannot tell you how handy (and cool) it was when I attended an ELO concert recently and marveled at how incredible Jeff Lynne sounded. Close your eyes, and it's 1978 all over again. "How old is that guy?" my friends and I wondered. Siri provided the answer in 7 seconds, which was just about as amazing as the fact that Jeff is belting out "Turn to Stone" at the ripe age of 71!

If you have an Alexa or Google Home, you have advanced to the next level because you are also asking the device to physically do something like "Play an ELO song" or "Turn off the heat" or "Order more Tide laundry detergent." And it generally works. While these interactions are still crude, the trajectory is obvious: we are on the cusp of smart, conversational interactions with machines.


Conversational Chatbots Have Massive Commercial Potential

Chatbots are getting a lot of press about how they represent a new and better way of human-machine interaction. After decades of building websites and mobile apps, we have discovered that simply digitizing libraries of information, while incredibly helpful, still doesn't address the fundamental engagement problem. People have short attention spans, they don't want to dig around to get answers, they don't remember passwords, and they're lazy. Texting, Instagram, Twitter, and Snapchat are popular because they are simple, brief, and designed for mobile.

The promise of chatbots is that they behave in a way that is similar to the way we interact with friends. Instead of requiring people to go to a place to find what they need, a bot interacts through text-like messaging, progressing down an engagement path that’s conversational and smart.


Still, Thin Bots are Popping Up Everywhere

Despite significant advancements in chatbot technology, there are a LOT of bots out there that do very little beyond saying “hello” before offering up a batch of Web content or a human agent. This is the “thin” experience of a thin bot — it’s underwhelming, shallow, and devoid of the true intuitive conversational engagement we expect from a digital agent that proposes to converse with us.

I recently engaged a website bot while searching for information at a large business software company. What started out as "how can I help you" quickly morphed into a text exchange with a live salesperson looking to get a deal going. I felt betrayed by the bot; I was only seeking some basic information. Instead, I found myself fending off a salesperson who had taken over mid-conversation from the chatbot, named Sally.

Worse, many of these thin bots fail at the outset of the conversation because they rely too heavily on natural language processing, a technology that is still, fundamentally and predominantly, a work in progress despite all the hype. Pick practically any bot from the thousands on Facebook Messenger today and you can see for yourself just how quickly and easily the conversation will find its way into the veritable ditch of confusion and misunderstanding.

A few weeks ago I was greeted by a chatbot on a large shopping mall website. Upon being asked what I was interested in, I simply typed in “food”. The result was: “I’m sorry, I don’t understand, please call this 800 number.” Are you kidding me?

When a thin-bot interaction like this fails, the failure is stark — and super frustrating. It feels regressive, not progressive, not clever at all. Even the old “Dial 1 to speak to a representative” would be better! Take me back to 1994.


NLP Is Still a Work in Progress, So Designers Should Be Wary

NLP, and the open, unbounded text entry box that invites it, can be very cool when it works flawlessly. In well-controlled demos it can work miracles. But the nuances of natural human conversation are so complex, and the threshold for conversational success is so high in so many business domains, that designers and builders of bots must be careful not to over-use or over-rely on NLP-dependent interactions. In healthcare conversations, for example, 87% accuracy simply will not cut it.

One of the leading design strategies is to remove the open text entry option entirely and serve up smart “response bubbles” for the user. All the user has to do is click the option closest to her intent or question, and the conversation continues. The AI in this strategy is directed at serving up the most likely response bubbles, based on all the known data: what is known about the user, what is known about her engagement patterns, her choices so far in this conversation, her choices in previous conversations, her selected preferences, etc. Doing this well at scale requires AI, but a different branch of AI than the AI used in natural language understanding (NLU) and natural language processing (NLP).
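As a rough illustration of the response-bubble strategy described above, a bot might rank a fixed set of candidate replies using simple per-user signals. This is a minimal sketch, not any vendor's API; every function and field name here is hypothetical:

```python
# Hypothetical sketch: rank a bounded set of "response bubbles"
# for a user instead of parsing open-ended text input.

def rank_bubbles(bubbles, user_profile, history):
    """Score each candidate reply from known user data and sort best-first."""
    def score(bubble):
        s = 0.0
        # Boost topics the user has chosen in past conversations.
        s += 2.0 * history.count(bubble["topic"])
        # Boost topics matching the user's declared preferences.
        if bubble["topic"] in user_profile.get("preferences", []):
            s += 1.0
        return s
    return sorted(bubbles, key=score, reverse=True)

bubbles = [
    {"text": "Find a restaurant", "topic": "food"},
    {"text": "See store hours", "topic": "hours"},
    {"text": "Get directions", "topic": "directions"},
]
# A user who prefers and has previously chosen "food" sees that bubble first.
top = rank_bubbles(bubbles, {"preferences": ["food"]}, ["food", "hours"])
```

The point of the sketch is the shift in where the intelligence lives: the AI's job is choosing which bounded options to surface, not interpreting unbounded text.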

Conversational flows can be powerfully engaging, but they don't need heavy NLP to be so. Designers are trying to deliver a well-scripted consumer journey anchored in a specific business or workflow context, so the choice set is not infinite but constrained to a specific sub-domain where the pathways are known and well documented. These journeys follow multiple possible pathways and are constructed dynamically as the user interacts with the bot and the data around her accumulates. In these kinds of conversational flows, the action is dynamic because it's data-driven.
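One way to picture such a guided journey is as a small state graph with a finite set of known pathways, where the branching at each step is driven by accumulated data. The states and fields below are invented for illustration only:

```python
# Hypothetical sketch: a guided conversation as a small state graph.
# The pathways are finite and known in advance; branching is data-driven.

FLOW = {
    "start":    lambda data: "confirm" if data.get("appointment") else "schedule",
    "schedule": lambda data: "done",
    "confirm":  lambda data: "prep" if data.get("needs_prep") else "done",
    "prep":     lambda data: "done",
}

def walk(flow, data, state="start"):
    """Follow the flow until 'done', recording the states visited."""
    visited = [state]
    while state != "done":
        state = flow[state](data)
        visited.append(state)
    return visited

# A patient with a booked appointment that requires preparation:
path = walk(FLOW, {"appointment": True, "needs_prep": True})
```

Because every pathway is enumerated up front, the bot can never wander into the "ditch of confusion": the worst case is a suboptimal branch, not a failed parse.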

We can't know the full data set or the timing of data provisioning along the conversational journey, but the bot is listening for signals, and designed to react intelligently. The AI brain consumes data in real time and executes conversational flows based on the data it receives. A brain that is capable of pulling this off in serious business domains — like healthcare, for instance — is massive, widely integrated, multi-circuited, data-rich, and imbued with powerful controlling functions that apply the learnings at the individual-conversation level, across millions of conversations, simultaneously.

So, if we care about what a weather forecast of rain might mean for a patient whose colonoscopy is two days out, a well-designed bot (one whose job is to remind the patient of her appointment and to ensure she's prepared and arrives on time) might offer up an Uber ride to avoid the risk of a last-minute cancellation. "Looks like rain on Thursday. Can I arrange an Uber ride for you and your companion?" the bot might say. This exchange, and thousands of others like it, requires no NLP. But it does require a lot of smart tech, AI, and great upfront dialog design.
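The appointment-reminder example above boils down to a scripted, data-driven trigger. A minimal sketch, with entirely made-up function and field names standing in for a real weather feed and scheduling system:

```python
# Illustrative sketch of a data-driven conversational trigger:
# if rain is forecast for the appointment day, offer a ride.

def ride_offer(appointment_day, forecast):
    """Return a pre-scripted bot message when weather risks a no-show."""
    if forecast.get(appointment_day) == "rain":
        return ("Looks like rain on {day}. "
                "Can I arrange an Uber ride for you and your companion?"
                ).format(day=appointment_day)
    # No weather risk detected: stay silent rather than interrupt.
    return None

msg = ride_offer("Thursday", {"Thursday": "rain", "Friday": "sun"})
```

Note that the user never types free text here; the intelligence is in deciding when to speak and what bounded choices to offer next.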

Fortunately, there is a lot of proven foundational tech available today that can indeed deliver on the promise of natural and engaging conversations that automate consumer-facing workflows in business. The conversations are guided and, yes, they can feel that way, but they’re also surprisingly flexible, dynamic, and useful.

Rather than over-weighting an NLP engine with open text entry fields, which must be interpreted exactly right but can too often be precisely wrong, designers of useful bots should employ AI to expertly surface the options and pathways that users are most likely to want and select. That is how they avoid the inevitable and catastrophic trap of an NLP failure and a "thin bot" experience. These should not be exercises in AI vanity but pursuits of well-designed conversational experiences that solve specific, high-value business problems.

As NLP improves, and it certainly will, designers can begin to take advantage of a powerful collection of natural language understanding tools to improve their conversational experience products even more. And the designers of thin-bot experiences can begin to fill in the gaps and pitfalls that have plagued their bots from the outset and made them largely unusable.

The foundation-level interaction flows that can transform consumer engagement are already being designed and deployed today for a wide range of complex, multi-step tasks with long time arcs, especially in healthcare workflows, where patient engagement is the watchword for leading healthcare providers. These chatbots work. Some are splendid, and all, if well designed, are surprisingly engaging. And soon enough, these conversations will be augmented further with even more power and depth, including full voice recognition and response capabilities, which will take us a step closer to the HAL 9000 or C-3PO. It could happen sooner than we think, so get ready.