A couple of weeks ago I was chatting with someone at a startup event, and when I mentioned that our business deploys chatbots to help save agents’ time, they rolled their eyes at the memory of dead-end conversations and endless loops to nowhere.
It’s an experience we’ve all had, and it’s why chatbots have a poor reputation. Customer service bots relied on limited artificial intelligence (AI), so they could only respond to narrow, pre-programmed queries. Rigid and robotic, early chatbots frustrated more users than they helped.
But now that’s all changing. Thanks to recent leaps in AI, specifically the arrival of large language models (LLMs), the chatbot landscape is shifting dramatically, and with it, what chatbots can mean for many types of business.
Let's explore how this new AI works and the ways it's enhancing chatbots and the customer experience.
The Old Bot Struggle
Building effective chatbots used to be an uphill battle. Developers had to manually map out every conceivable conversation path and script appropriate responses, which demanded tremendous upfront time and effort. And then came the constant tweaking, as customers inevitably asked questions outside the predicted scope and sent the bot into a terminal spin.
Even with meticulous design, traditional chatbots still sounded stilted because they couldn’t actually comprehend language: they matched keywords rather than understanding context (this is a key difference between the “old” way and the new LLM way). The result was the nagging feeling that the chatbot you were talking to just didn’t get it.
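To make the keyword limitation concrete, here is a minimal toy sketch of an "old"-style bot (the keywords and replies are invented for illustration). Any phrasing the developers didn't predict falls straight through to a dead end:

```python
# A toy keyword-matching bot in the "old" style: it can only react to
# phrases its developers scripted in advance.
RESPONSES = {
    "opening hours": "We're open 9am-5pm, Monday to Friday.",
    "refund": "Refunds are processed within 5 working days.",
}

def old_style_bot(message: str) -> str:
    text = message.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in text:
            return reply
    # Anything unanticipated lands here - the "terminal spin".
    return "Sorry, I didn't understand that."

print(old_style_bot("What are your opening hours?"))   # scripted keyword hit
print(old_style_bot("When can I drop by your shop?"))  # same intent, no keyword: dead end
```

The second question means exactly the same as the first, but because no scripted keyword appears in it, the bot gives up; an LLM, by contrast, would recognise the intent from context.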
As you can imagine, these limitations produced many unsatisfying chatbot interactions, as my acquaintance above can testify. They also left many businesses wondering whether the investment was simply a waste of time and effort.
The Generative AI Difference
The latest AI innovation shifting the chatbot game is the large language model. LLMs such as GPT-3 and its successors (which power end-user tools like ChatGPT) absorb massive text datasets, learning the statistical patterns of human conversation.
This allows them to generate remarkably natural-sounding responses on the fly instead of following rigid dialogue trees, which make bots sound like, well, robots. Hence the tag “generative AI” for these new types of model.
LLMs also have stronger contextual understanding compared to previous “old” AI. They can parse nuanced customer questions and provide accurate answers, not just generic (and often wrong) responses based on recognizing keywords.
With LLM intelligence, chatbots deliver some key advantages over “old” style bots:
> Faster, cheaper development - substantially fewer resources are now required to create your own chatbot
> More accurate responses - LLMs better interpret varied questions on a topic, and can extract answers just from your own data (more on that another time)
> More humanlike conversations - The AI's writing flows smoothly and conversationally
> Easier scalability - Adding new info is faster vs. traditional chatbot rebuilding
> Lower maintenance - Less oversight needed with dynamic AI responses
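One common pattern behind the "answers from your own data" point is retrieval: find the most relevant passage in your documentation and hand it to the LLM as context. The sketch below is deliberately simplified (the passages, overlap scoring, and prompt format are all illustrative assumptions; a real system would typically use embedding-based search and then send the prompt to an LLM API):

```python
# Minimal retrieval sketch: score each documentation passage by word
# overlap with the question, then build a prompt that grounds the LLM
# in the best-matching passage.
PASSAGES = [
    "Orders placed before 2pm ship the same day.",
    "Refunds are processed within 5 working days of receiving the item.",
    "Our support team is available 9am-5pm on weekdays.",
]

def best_passage(question: str) -> str:
    q_words = set(question.lower().split())
    # Crude relevance score: how many words the passage shares with the question.
    return max(PASSAGES, key=lambda p: len(q_words & set(p.lower().split())))

def build_prompt(question: str) -> str:
    context = best_passage(question)
    return f"Answer using only this context:\n{context}\nQuestion: {question}"

print(build_prompt("How long do refunds take?"))
```

Adding new information then means adding passages, not rebuilding dialogue trees, which is the scalability advantage in the list above.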
LLMs have only been commercially available since the beginning of this year, so their impact on the chatbot world is only just being felt - but it’s growing fast.
Chatbot Success Stories
With their upgraded AI capabilities, chatbots are now yielding impressive results across many customer service scenarios. Here are a few examples:
> FAQs - LLMs provide consistent support for common questions on policies, orders, etc.
> Troubleshooting - Chatbots can leverage product docs and data to diagnose issues and advise solutions.
> Customer service - Chatbots can be left to handle straightforward inquiries, with more complex questions being handed over to human agents.
> E-commerce - Shoppers get quick access to inventory, recommendations, and their order details.
The Future with LLMs
Chatbots have turned an important corner thanks to leaps in AI.
As large language models grow more sophisticated, chatbots will become an even bigger competitive advantage. Their ability to converse contextually and access a business’s specific data will only improve their usefulness. Soon chatbots will be on a par with, or even exceed, humans at basic customer service tasks, while operating 24/7 without taking a coffee break.
So in future, many businesses will be able to deploy AI-powered chatbots to reduce costs and boost customer satisfaction. Not only that: chatbots can take the strain off existing agents, who may be overburdened answering straightforward questions over long hours, and do it at a small fraction of the cost.
It's important to note, however, that human agents will always have a role in handling more complex or empathetic enquiries, and that handover to a human will be an important mechanism in most contact centre environments.
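One simple way to implement that handover is a routing rule that escalates whenever the bot is unsure of its answer or the topic is sensitive. The confidence threshold and topic list below are illustrative assumptions, not a prescription:

```python
# Sketch of a handover rule: send the conversation to a human agent when
# the bot's answer confidence is low or the topic is flagged as sensitive.
SENSITIVE_TOPICS = {"complaint", "bereavement", "legal"}
CONFIDENCE_THRESHOLD = 0.7  # illustrative value; tune per contact centre

def route(topic: str, confidence: float) -> str:
    """Return 'bot' or 'human' for a given enquiry."""
    if topic in SENSITIVE_TOPICS or confidence < CONFIDENCE_THRESHOLD:
        return "human"
    return "bot"

print(route("order status", 0.92))  # routine, high confidence: stays with the bot
print(route("complaint", 0.95))    # sensitive topic: handed to a human
```

Rules like this keep the bot on the straightforward, high-volume questions while guaranteeing that the harder, more empathetic conversations reach a person.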
As chatbot creation becomes more accessible, companies of all sizes can implement virtual agents - no army of developers is required. Where chatbots used to disappoint, they can now delight.
It’s time to think differently about chatbots - the future now looks bright for this new breed of friendly, helpful AI assistants.