Let me start by stating the obvious: automation has its best home in the contact center. At many brands, agents make up 10% of the workforce. They are the face of your brand to your customers, providing critical empathy and solving important problems. Agents also spend a lot of time performing rote tasks for customers that really should not require a human. An automated interaction generally costs about a tenth as much as a conversation with an agent. That cost differential produces gaudy ROI numbers, prompting brands to implement chatbots and interactive voice response (IVR) systems designed around cost savings rather than the customer — systems that often undermine the overall customer experience and harm loyalty.
Enter generative AI and large language models (LLMs): technology that can deliver natural conversations that are useful to customers while requiring a fraction of the time and effort of traditional conversational AI toolsets.
This new AI technology poses an existential threat to conversational AI vendors. Think about the transition from the ice box to the refrigerator — a massive leap in capability that was also far easier to use and maintain. Imagine being an ice harvester realizing that, with refrigeration, your industry had become obsolete (thank you, Guy Kawasaki, for this metaphor). The vendors in this wave are not ice harvesters. While no amount of retooling could keep an ice harvester relevant in the era of refrigeration, conversational AI vendors have embraced generative AI and LLMs to offer far more than seemed possible just a year ago.
In the blink of an eye since ChatGPT was announced, the vendors in this wave have all evolved from companies that could build a single intent in six weeks or so using traditional natural language understanding (NLU) technology into providers of these important capabilities:
- Orchestrating various AI resources including LLMs and generative AI to ensure the right resources are used at the right time for the best possible customer experience.
- Providing guardrails — such as retrieval-augmented generation (RAG), vector search, and specific, finely tuned LLMs — to make generative AI safe for customer-facing interactions.
- Helping their customers find secure spaces, such as RAG-driven frequently asked questions (FAQ) applications, to begin their generative AI self-service journey.
- Managing transaction workflows to deliver positive customer experiences while meeting business requirements.
- Future-proofing brands’ self-service offerings with flexible technology that will keep up with the madness that is generative AI.
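To make the RAG-style guardrail above concrete, here is a minimal sketch of grounding an FAQ chatbot in a curated knowledge base. Everything here is an illustrative assumption — the FAQ entries, the bag-of-words similarity, and the confidence threshold stand in for the embedding models and vendor pipelines real products use:

```python
# Minimal sketch of a RAG-style guardrail for an FAQ chatbot.
# All names, data, and thresholds are illustrative assumptions,
# not any vendor's actual implementation.
from collections import Counter
import math

FAQ = {
    "How do I reset my password":
        "Go to Settings > Security and choose 'Reset password'.",
    "What is your refund policy":
        "Refunds are available within 30 days of purchase.",
}

def _vectorize(text: str) -> Counter:
    # Toy bag-of-words "vectoring"; real systems use embedding models.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(question: str, threshold: float = 0.3) -> str:
    qv = _vectorize(question)
    best_q = max(FAQ, key=lambda k: _cosine(qv, _vectorize(k)))
    score = _cosine(qv, _vectorize(best_q))
    if score < threshold:
        # Guardrail: rather than letting an LLM improvise (and
        # hallucinate), escalate low-confidence questions to an agent.
        return "Let me connect you with an agent."
    # In a full RAG pipeline, the retrieved text would be passed to the
    # LLM as grounding context; here we return the curated answer.
    return FAQ[best_q]
```

The design point is the fallback: when retrieval confidence is low, the bot hands off to a human instead of generating an ungrounded answer — the "secure space" approach the list above describes.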
The result of this transformation is improved chatbots and intelligent virtual agents (IVAs) that can be delivered more cost-effectively than ever before, all thanks to the capabilities of LLMs and generative AI. This isn’t just a bolt-on; vendors in this wave have completely revamped their offerings by placing LLMs and generative AI at the core. It’s remarkable — I’ve never witnessed such rapid reinvention of an entire market space in less than a year.
If you are a contact center manager, it’s time to start studying up. The vendors had to make this change to survive. You don’t need to worry about generative AI making your business obsolete, but you can follow their lead and differentiate with a new generation of likeable, useful chatbots that provide real value to your customers and significant cost savings for your organization.
Don’t get me wrong. It’s early days for brands thinking about delivering generative AI-powered chatbots or IVAs. This is still bleeding-edge stuff. Approaches to mitigating hallucinations and other generative AI issues are in their infancy. However, the vendors in this wave all understood that the impact of generative AI and LLMs is so powerful that they transformed themselves to embrace these technologies. Now is the time for you to start taking advantage of what they built. It’s a new ball game and you need to play.
Click here to read the Conversational AI for Customer Service Wave.