

Why does your Chatbot Suck?

by Carylyne Chan
9 min read

With all the hype around building bots these days, you may have built one but realized it’s not as good as you’d hoped.

The big question is: Why does your chatbot suck? And can it be improved?

The premise behind why the chatbots of today are not up to par is largely the older paradigm of using keywords to understand what customers are asking. This means that many questions are misinterpreted, frustrating the customer. There are also bots that have screwed up, like the infamous Tay, which did not factor in the corrosiveness of user inputs.

Bots have the potential to replace apps and websites by letting users buy items, book services, be entertained, and more. However, since bots are a fundamentally new, interface-free way of interacting, it becomes even more important to design the flow of the experience so users have clear expectations and hints about what they can do with the bot.

Here we discuss what most of us overlook when we first build a bot, and what to do to improve the bot’s effectiveness.

1. Build in edge cases

These are the cases that your bot might not be equipped to handle, such as:

Multiple questions

Example: “Do you know if I can upgrade my plan? If I do, how much will it cost?”

Our thoughts fire in succession, so it’s natural that we ask questions the same way. You may have noticed that customers write a long question with multiple parts, but the chatbot is unable to answer each of them adequately.

With multiple questions, your bot needs to recognize where one question part ends and the next begins, and perhaps what the relationship between the sentences is. Here is where context becomes extremely important (read on to find out more about context for bots) in figuring out and attending to each of the questions posed by the customer.

If the bot answers only the first question, or doesn’t understand that the questions are related to each other, then users will quickly get frustrated and lose patience with the bot.

Seeing as most of us use bots to avoid going to a store, staying on the phone for hours, or downloading yet another app, failing here renders the bot’s utility basically zero.

A well-designed bot would take these multiple inputs into account, so as to better form a full picture of exactly what the user is attempting to achieve by talking to your bot. Mapping and correlating instances of questions appearing near each other would be ideal to see related topics that customers are most concerned about so you can address them proactively.
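As a rough illustration of that first step, segmenting a multi-part message before intent classification, a naive first pass might look like the sketch below. This is a hypothetical heuristic, not any particular NLU engine’s API; a production bot would use a trained sentence-boundary and discourse model instead.

```python
import re

def split_questions(message: str) -> list[str]:
    """Split a multi-part customer message into individual segments.

    Naive heuristic: break on sentence-final punctuation followed by
    whitespace, keeping only non-empty segments. Each segment can then
    be routed through intent classification separately, with earlier
    segments' topics carried forward as context for later ones.
    """
    parts = re.split(r"(?<=[.?!])\s+", message.strip())
    return [p for p in parts if p]

msg = "Do you know if I can upgrade my plan? If I do, how much will it cost?"
segments = split_questions(msg)
```

Here the second segment (“how much will it cost?”) only makes sense once the first segment’s topic (a plan upgrade) is carried over as context.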

Complex questions

Example: “I’m trying to get it to work, and I’ve tried turning it off and on, cleaning the port, and doing a soft reset. Can you help me?”

Troubleshooting questions are usually too complex for most chatbots. As it is, it is pretty hard for most chatbots to understand standard questions and answers in context and give a satisfactory answer.

Special training is required here. The bot needs an inbuilt decision tree that captures a range of potential solutions, uses its natural language understanding engine to figure out which steps the customer has already tried, and then steers them in the right direction.

Fully mapping out these scenarios, just as we would map entire user flows, means that the majority of outcomes users anticipate have been catered for, and your bot can immediately respond with new suggestions, especially if it is meant to quickly help customers troubleshoot or walk through setup procedures.

With rigorous training on these potential flows, users get an ideal experience, receiving the right answers most of the time, which saves them time and saves you from spending heavily on support.
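The filtering step, skipping solutions the customer has already attempted, can be sketched minimally as below. The step labels and ordering are hypothetical, and in practice the NLU engine (not shown) would extract the tried-step labels from the customer’s message.

```python
# Ordered troubleshooting steps for one hypothetical product issue.
TROUBLESHOOTING_STEPS = [
    "power_cycle",   # turn it off and on
    "clean_port",
    "soft_reset",
    "factory_reset",
    "contact_support",
]

def next_steps(tried: set[str]) -> list[str]:
    """Return the remaining steps in order, skipping any the customer
    reports having already attempted."""
    return [s for s in TROUBLESHOOTING_STEPS if s not in tried]

# Labels the NLU engine might extract from: "I've tried turning it off
# and on, cleaning the port, and doing a soft reset."
tried = {"power_cycle", "clean_port", "soft_reset"}
remaining = next_steps(tried)
```

The bot can then lead with the first remaining step rather than re-suggesting what the customer has already done.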

Situational questions

Example: “I want to access my order info, but your site is down and I can’t sign in.”

Temporary situations, such as the website being down, a flash sale, or a limited-time offer, require special information that the chatbot may not have had when you first taught it the basics of the questions and answers it’s expected to know.

What’s imperative in this situation is having a system flexible enough to let you add new cases that you want the bot to handle (especially during those festive seasons!) so that you can take care of more customers, faster.

Feedback or feature requests

Example: “I really think that you should add this awesome new feature, it would help me so much.”

These are important bits of information that could help with product-building, so you want the chatbot to capture as much detail as possible from the customer, with follow-ups such as “Could you describe what you envision it to look like?” or “What else do you think we could change so that it suits your use case?”

Most of the time, people go to the same channels for support as well as for recommendations and feedback, so it’s important to keep these aspects in mind when designing the chatbot’s knowledgebase and language generation engine.

Being ready for scenarios outside of what you intended the bot to do, such as these, will increase its usefulness for customers as an end-to-end support bot, and can also give you a wealth of information based on what specific customers are requesting (useful for future segmentation and targeting).

Insults

Example: “You’re useless.”

It happens – people get frustrated and start hurling insults at the unsuspecting chatbot. The same advice holds from customer service 101: Don’t take it “personally” (heh) and think of the different ways you can resolve the issues instead.

This is also a juncture where the bot needs to recognize it may not be doing a good job, and hand off the conversation seamlessly to a human agent who may be better equipped to handle the customer.

The last thing we want is an irate customer verbally abusing your helpless bot without a resolution in sight. Since insults generally hint at deeper frustration, it is better to err on the side of caution and transfer the customer to an agent who can better handle the issue (and the runaway emotions), so that you retain the customer relationship.
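A crude sketch of such an escalation trigger is below, using a hypothetical keyword list rather than a real sentiment model, which a production bot would use instead:

```python
# Hypothetical markers of frustration; a real bot would use a trained
# sentiment/abuse classifier rather than a keyword list.
INSULT_MARKERS = {"useless", "stupid", "terrible", "hate"}

def should_escalate(message: str, failed_turns: int) -> bool:
    """Hand off to a human agent when the customer is insulting the bot,
    or when the bot has failed to help for several turns in a row."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return bool(words & INSULT_MARKERS) or failed_turns >= 3
```

Combining the insult signal with a count of failed turns catches both the customer who says so outright and the one who silently keeps rephrasing.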

Chatter

Example: “Where were you born?”

You’ll be surprised how many people ask chatbots questions like this. By not preparing the bot’s “backstory,” you may inadvertently churn out a lot of “I don’t know what you mean”-type responses that make your chatbot look dumber than it actually is.

This is a great place to play up your brand personality and add in content that would surprise and delight your customers. The brand’s backstory can really shine through if you take the time and effort to teach it.

All these edge cases can be thought through and built into the training material that you feed your bot. Like screen states, these are the essential components that make up the UI-less conversational experience.

2. Understand context

As we’re aware, context is a critical principle in UX when considering what the user may want or expect.

There are a few elements of context that most chatbots these days ignore. First, language context: there are plenty of examples of assistants not understanding simple sentences, or being very unhelpful with their suggestions. These, as briefly explained above, show just how difficult it is to tag and parse natural language, especially when it comes in many forms from different people. The ambiguity of sentences in dialogue makes them especially difficult to decipher without large amounts of training data.

Second, most chatbots ignore the customer’s context: the customer may have asked the same question before, or just gotten off the phone with your support helpdesk. Being asked to repeat security answers and issues over and over while being transferred among service agents vexes most of us greatly.

Situational context is also largely disregarded. When you have a new product launch, or when your site is down, the bot is isolated and doesn’t understand that it needs to respond to people with new and urgent questions (see the following section).

Be sure to factor these various contextual cues into both the natural language technology stack that you choose and the integrations that you pipe into supporting the bot. For example, if you sell products with an inventory, then ideally the bot should understand what your customer is asking for and check whether you have it available.
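The inventory check mentioned above could be sketched like this; the function, data shape, and messages are illustrative, not a real integration:

```python
def answer_availability(product: str, inventory: dict[str, int]) -> str:
    """Answer a stock question by consulting live inventory data,
    rather than replying with a canned FAQ response."""
    count = inventory.get(product, 0)
    if count == 0:
        return f"Sorry, {product} is out of stock right now."
    if count < 5:
        return f"Yes, but only {count} left, so order soon!"
    return f"Yes, {product} is in stock."
```

In a real deployment the `inventory` lookup would be a call to your commerce backend; the point is that the reply is grounded in current data, not the training material.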

3. Anticipate integrations

A major reason customers get frustrated at bots is that they can only answer commonly asked questions, and have no real access to their accounts or past histories. Customers are not able to conduct any meaningful business with your bot, or even ask questions specific to their orders or plans.

Non-integrated bots are a huge lost opportunity for engagement and marketing over time. By having a chatbot that understands your customer’s needs, it will be able to provide tailored, predictive replies to what it anticipates the customer could want. For example, knowing that someone has just placed an order with you, your bot can attend to delivery status requests, or provide tailored recommendations to your newly acquired customer.

It is important for your chatbot to also be integrated with the CRM, ticketing and fulfillment business processes and systems so that you can accurately and swiftly catalogue and assess a customer’s status at all times. This could turn out to be a huge competitive advantage for your business, especially right now.

4. Have an escalation protocol

It’s inevitable that sometimes the bot will be stumped. You need to identify the moments when it doesn’t know what to say, and quickly divert the conversation to someone (probably human) who can respond to the customer.

When identifying this escalation point, whether through time cues or through the low confidence of the answer the bot is about to give, have a friendly message at hand that tells the customer they’re being passed to a subject matter expert, so they’re reassured their problem is being solved.

All the rules of good customer support apply here!
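One simple way to implement the confidence-based handoff, sketched with an assumed confidence score from the NLU engine and a hypothetical threshold and message:

```python
# Hypothetical threshold; tune against real conversation logs.
CONFIDENCE_THRESHOLD = 0.6
HANDOFF_MESSAGE = (
    "Let me connect you with one of our specialists who can "
    "help you with this right away."
)

def respond(answer: str, confidence: float) -> tuple[str, bool]:
    """Return (reply, escalate): the bot's answer when confidence is
    high enough, otherwise a friendly handoff message plus an
    escalation flag for routing to a human agent."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer, False
    return HANDOFF_MESSAGE, True
```

The escalation flag is what the surrounding system would use to transfer the conversation, along with its transcript, to an agent queue.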

5. Self-learn

In addition to these challenges, it may be hard to add new questions and answers to your bot: when you have new knowledge to teach it, the process between you and your chat provider can be long, with a lot of dawdling before the information is added.

As part of its capabilities to learn, a bot must be able to quickly ingest and understand your previous interactions with customers and hypothesize what its answer may be to them. This machine learning process should be fast enough so that implementation can be swiftly up for testing by your sales and support staff, and even a select group of your customers. So pick the right framework, and run with it.

Can you imagine chatting with a bot that takes five minutes to reply to your questions when you are facing an emergency (or at least an issue urgent enough that you bothered to look for the bot in the first place)? By stress-testing user loads, as we would test our current websites and apps, we can ensure that bots function optimally even at scale.

A word on frameworks: with bot-building being a cross-functional job, it is important that product owners, managers, and designers can also add to the bot’s capabilities, such as its linguistic ones (adding words and synonyms that users would use, new ways of asking questions, and so on). Just like a good product speaks the user’s language, a bot needs to understand how the user speaks, and what they are looking for, to be a good bot.
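As a small illustration of letting non-engineers extend the bot’s vocabulary, a synonym table mapping user wording to the canonical terms the bot was trained on might look like the sketch below; the table contents are hypothetical, and a real framework would expose this through an editing UI rather than source code:

```python
# Editable mapping: canonical term -> words users actually type.
SYNONYMS = {
    "cost": {"price", "fee", "charge"},
    "cancel": {"terminate", "end", "stop"},
}

def normalize(word: str) -> str:
    """Map a user's word to the canonical term the bot was trained on,
    so coverage can be extended by editing SYNONYMS alone."""
    w = word.lower()
    for canonical, alternatives in SYNONYMS.items():
        if w == canonical or w in alternatives:
            return canonical
    return w
```

Because intent matching runs on normalized terms, adding a row to `SYNONYMS` immediately teaches the bot a new way users phrase an existing question.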

What now?

Underpinning all of these factors is the fact that natural language processing is a highly technical and complicated space, involving areas such as tagging and parsing, and expert fields such as linguistics. Don’t forget, either, that robust security and privacy safeguards are important for protecting your users. Since most of the points above may not have been considered when you started working on your bot, now is a good time to reevaluate how your chatbot is doing and what you can do to improve it.

Remember, in a medium with no UI, the bot needs to take all these factors into account for an amazing experience.

Carylyne Chan
I'm a co-founder of KeyReply, an artificial intelligence-powered chatbot-as-a-service for midmarket and enterprise clients in the hospitality, ecommerce and software space. I work on product, design, marketing, data crunching and front-end (though my job scope sometimes changes day-to-day.) I have built and launched products at companies such as 3M, American Express, AGT International, and strategy + account management at DDB, among others. I'd love to talk to you, so get in touch!
