Expect GenAI to Take on Customer-Facing Roles
Large language models (LLMs), the underlying technology behind generative artificial intelligence, are helping make contact centers much more efficient. But they are not yet being used to their full potential. Today, most contact centers are using LLMs internally to help agents perform their jobs more efficiently and provide customers with faster, more comprehensive service, but they have yet to turn the technology toward customer-facing use cases. That could change very soon—quite possibly even in 2024, experts agree.
Advances in generative AI for customer service are developing rapidly, according to Simon Thorpe, director of global product marketing for customer service and sales automation at Pegasystems. “Large language models being able to create more humanlike interactions through the self-service channel is a partnership made in heaven. This is something that will come very quickly,” he says.
Research from Zendesk backs that up. In its most recent customer experience trends report, the company revealed that 65 percent of business leaders believe the AI they use is becoming more natural and humanlike, and that it is only going to get better.
“It’s less about when the technology will be ready and more about when organizations become ready, coupled with how broad of a problem scope genAI is given freedom to handle,” says Aaron Schroeder, director of AI solutions at TTEC Digital.
Today genAI is limited, “but that boundary keeps creeping further and further as the rapid innovation in the space continues and as companies leverage partners that specialize in AI and CX,” Schroeder says.
LLMs can be ready for much more extensive use in customer-facing applications when there is a clearly defined use case grounded in strategy, business objective alignment, adoption planning, defined measures of success, and risk mitigation/security planning, he adds. “There’s a nuance between when something is ready for use and when something is ready for value derivation. Bright and shiny object syndrome drives many organizations to only focus on the former.”
And customers are also ready for greater use of AI in customer service settings, Thorpe says, noting that most have indicated a preference for self-service for years, even as virtual assistant chatbots have been able to answer only the most basic questions. The expectation, he adds, is that generative AI-powered virtual assistants will be able to help with more detailed answers, though today that customer-facing capability is still very limited.
THE HALLUCINATION HOLDOUT
The reason genAI in customer service is limited, experts agree, is the same one that has been holding it back across other use cases: hallucinations.
Indeed, there is concern about LLMs working more directly with customers because of these hallucinations, which occur when the LLM perceives patterns or objects that are nonexistent or imperceptible to human observers, creating outputs that are nonsensical or altogether inaccurate.
“Striking a balance between humanlike interaction and minimizing the likelihood of generating inaccurate data becomes a critical tradeoff,” cautions Rob McDougal, CEO of Upstream Works, a provider of omnichannel contact center software. “Hallucinations, wherein the AI fabricates responses, pose a considerable risk.”
“At this time, generative AI can be dangerous for customer-facing interactions because, by design, the technology doesn’t know yet how to say: ‘I don’t know,’” explains Yan Zhang, chief operating officer of PolyAI, which builds enterprise conversational assistants that carry on natural conversations with customers to solve their problems. “Since generative AI pulls information from the public internet, the answers it puts out may be misleading, or at worst wrong.”
“While generative AI can personalize customer engagement on a massive scale, our point of view is that AI shouldn’t sit in the driver’s seat,” adds Joe Bradley, chief scientist at LivePerson, a conversational AI systems provider. “Right now, it works best as a co-pilot that helps customers get where they need to go.”
LLMs currently fail with unanticipated scenarios, untrained data, or empathy-driven situations, McDougal says. “Though generative AI enhances chatbot interfaces and self-serve [interactive voice response] scripts, it necessitates carefully constrained responses, given the challenge of exhaustive training.”
There’s also brand risk if genAI hallucinates and provides incorrect information in a customer-facing solution, Schroeder adds.
Companies that have already deployed genAI-powered bots in some of their operations have had trouble quantifying accuracy because there is no script to measure against, which makes areas of weakness difficult to find, and tolerance for the risk of a bot giving wrong answers is low, adds Melissa Copeland, principal and founder of Blue Orbit Consulting.
And because genAI hasn’t yet been deployed as part of an overall customer journey or customer experience, its impact on customer satisfaction and customer behavior is also untested. The high degree of interest in the technology means that organizations have to define test cases to understand how it works and to forecast investment, risk, and return more concretely.
Though no one can say for sure when genAI-powered bots can be successfully deployed for more general customer-facing uses, Chris Johnson, CEO of Experience Dynamic, says the hallucination rate is dropping.
THE INSIDE TRACK
While experts agree that genAI’s role in customer service is currently limited to internal operations, there is a vast set of applications for the technology within that realm.
“The killer app around AI is more about helping agents do their jobs than it is around chatbots,” McDougal says.
“We’re already using generative AI to heavily guide and support agents,” Thorpe agrees. “They can summarize calls and summarize interactions. The benefits of that are exponential.”
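As a rough illustration of the summarization Thorpe describes, the sketch below condenses a service transcript with an LLM. It assumes the OpenAI Python SDK and an API key in the environment; the model name and prompt wording are illustrative choices, not any vendor's actual implementation.

```python
# Minimal sketch of agent-assist interaction summarization.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable; the model name and prompt are illustrative.
from openai import OpenAI

client = OpenAI()

def summarize_interaction(transcript: str) -> str:
    """Condense a contact center transcript into notes an agent can reuse."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any chat-capable model works
        messages=[
            {"role": "system",
             "content": "Summarize this customer service transcript in three "
                        "bullet points: the customer's issue, what was done, "
                        "and any follow-up owed to the customer."},
            {"role": "user", "content": transcript},
        ],
        temperature=0.2,  # keep the summary close to the source material
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize_interaction(
        "Customer: My router keeps dropping the connection...\n"
        "Agent: I've reset the line from our side and scheduled a technician."
    ))
```

A low temperature keeps the summary anchored to what was actually said rather than inviting embellishment.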
GenAI is also being used in contact centers to develop workflows, call scripts, routing patterns, personas, and much more.
Some companies are also using generative AI to work directly with agents to provide coaching, using guidance provided by previous customer interactions, says Frank Schneider, a vice president and AI evangelist at Verint.
This coaching can include upsell recommendations that sound more natural and have less of a hard-sell tone than is typically used, according to Schneider.
Additionally, for onboarding and agent training, genAI helps provide customer interaction examples based on data generated from real-life use cases. Zendesk AI, for example, is built on billions of real customer service interactions and understands customer experience, taking the guesswork away from agents so they can focus on providing an excellent experience, according to Cristina Fonseca, vice president of product at Zendesk.
Another way contact centers are deploying genAI today is to create no-code AI assistants out of existing customer service policies and procedures, LivePerson’s Bradley points out. Such bots can quickly provide customers with detailed, natural responses. However, these bots need to be created to address specific questions and to resolve specific issues. They also must be continuously monitored by human agents.
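The pattern Bradley describes, a bot built from existing policy content, constrained to specific issues, and watched by humans, can be sketched roughly as follows. The policy text, model name, and escalation convention here are illustrative assumptions, not LivePerson's actual design.

```python
# Minimal sketch of a policy-grounded assistant with human escalation.
# Assumes the OpenAI Python SDK; the policy text, model name, and the
# "ESCALATE" convention are illustrative, not any vendor's design.
from openai import OpenAI

client = OpenAI()

RETURN_POLICY = """Returns are accepted within 30 days with a receipt.
Refunds are issued to the original payment method within 5 business days."""

SYSTEM_PROMPT = (
    "You answer questions about the return policy below. Answer ONLY from "
    "that policy. If the question is about anything else, or the policy does "
    "not cover it, reply with exactly the word ESCALATE.\n\n" + RETURN_POLICY
)

def answer_or_escalate(question: str) -> str:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "system", "content": SYSTEM_PROMPT},
                  {"role": "user", "content": question}],
        temperature=0,
    ).choices[0].message.content.strip()

    if reply == "ESCALATE":
        return "Let me connect you with a human agent."  # hand off to a person
    # Log every exchange so supervisors can review the bot's answers.
    print(f"[log] Q: {question!r} -> A: {reply!r}")
    return reply

print(answer_or_escalate("Can I return an item after 30 days?"))
```

The key idea is the explicit fallback: anything the policy does not cover is handed to a person rather than improvised, and every exchange is logged so supervisors can monitor the bot's answers.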
And even though use is limited, experts also point out that customer experience operations are already far more advanced in deploying LLMs and genAI than other business units, having moved past the public relations and early hype phases and into the implementation stage to improve agent performance, streamline workflows, and ensure that each customer interaction is unique and personalized.
“Some of the big contact center vendors are including generative AI as part of what they are providing. Zoom is doing it. Cisco is doing it,” McDougal says. “But what they are doing is still constrained.”
BIG PLANS
Going forward, there are big expectations for how genAI will help improve customer service, especially with regard to chatbots.
Chatbots powered by genAI can speak and understand human language, using LLMs to match brand personas and surface the right resources based on previous interactions, according to Fonseca.
Thorpe expects LLMs to eventually enable companies to train virtual assistants and chatbots to be able to provide not only responses to direct questions, but also predictions, actions, and resolutions.
“If we get to the point that customers are actually adopting this service, it will make customers happier and it will make enterprise brands happier because costs will go down,” Thorpe says. “Costs will go down because companies using the technology, once the hallucination issue is solved, will be able to handle more interactions with fewer human customer service representatives.”
For contact center chatbots, generative AI represents progress from scripted models, enabling more natural conversations, according to McDougal.
Yaniv Hakim, cofounder and CEO of CommBox, expects chatbots to move from an inconvenience to an enabler once genAI advancements take hold. “Today’s chatbots often leave customers frustrated. However, advancements in generative AI are transforming chatbots into valuable tools which can provide personalized support around the clock and resolve issues instantly, leading to increased customer satisfaction and lower operational costs for businesses,” he says.
Among the advances under way that could have the biggest impact on the customer service space is retrieval-augmented generation (RAG), which grounds an LLM's answers in a company's own content: relevant documents are retrieved at query time and supplied to the model alongside the customer's question, so responses draw on specific company information rather than on the model's training data alone.
Imagine a future where AI bots seamlessly understand and engage with users as if they were their best friends. RAG technology is the key to making this vision a reality, according to Nikola Mrkšic, co-founder and CEO of PolyAI.
“By combining the language prowess of AI with real-world knowledge, RAG revolutionizes customer interactions. It goes beyond conventional chatbots by personalizing engagements, boosting efficiency, and even predicting customer needs before they articulate them,” he explains.
“What sets RAG apart is its analytical capability. It doesn’t just chat; it proactively analyzes data, anticipating potential issues and offering support before customers even realize they need it. This proactive approach not only builds trust but also strengthens relationships by preventing problems before they arise,” Mrkšic continues.
RAG, he says further, “has the potential to be a game changer in the customer experience landscape.”
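At its core, RAG pairs a retrieval step with a generation step. The sketch below, assuming the OpenAI Python SDK and NumPy, ranks a handful of company documents by embedding similarity and answers only from the best match; the documents, model names, and prompt are illustrative and not drawn from PolyAI or any other vendor mentioned here.

```python
# Minimal retrieval-augmented generation (RAG) sketch. Assumes the OpenAI
# Python SDK and NumPy; the documents, model names, and prompt are illustrative.
import numpy as np
from openai import OpenAI

client = OpenAI()

DOCS = [
    "Premium support is available 24/7 by phone for Gold-tier customers.",
    "Standard shipping takes 3-5 business days; expedited takes 1-2 days.",
    "Password resets can be done from the account page under Security.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

DOC_VECTORS = embed(DOCS)

def answer(question: str) -> str:
    # Retrieve: rank documents by cosine similarity to the question.
    q = embed([question])[0]
    scores = DOC_VECTORS @ q / (
        np.linalg.norm(DOC_VECTORS, axis=1) * np.linalg.norm(q))
    context = DOCS[int(scores.argmax())]

    # Generate: answer using only the retrieved company content.
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer using only this company information: " + context},
            {"role": "user", "content": question},
        ],
    )
    return reply.choices[0].message.content

print(answer("How long does standard shipping take?"))
```

Because the model is handed the retrieved company content at answer time, it is less dependent on whatever happened to be in its training data, which is one reason RAG is often discussed as a hedge against hallucination.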
However, before deploying generative AI for external or internal uses, organizations need to start with a clear experience vision, aligned to business objectives. Then they can identify where applying generative AI makes the most sense, according to TTEC Digital’s Schroeder. “The technology enables the strategy. With that philosophy in mind, we’d recommend isolating three core areas for evaluation: people, process, and design.”
Also add data to that list. Michael Lawder, chief experience officer of ASAPP, a provider of genAI solutions for contact centers, places it at the forefront.
“To implement AI into their CX stack in 2024, enterprises must establish foundational building blocks, such as high-quality transcription, and get their data house in order before they can really take advantage of the benefits of AI,” he states.
Chris Gladwin, CEO and co-founder of Ocient, a hyperscale data analytics solution provider, agrees. “The value of AI depends on the quality of its training data, and extracting maximum value from multimillion-dollar AI investments is challenging without systems and architectures designed for massive data processing,” he maintains. “Companies that will be leading with the transformative power of AI in 2024 will be those investing in hyperscale architecture as a strategic cornerstone vs. a plug-in solution, and I expect we’ll start seeing a significant shift in the AI landscape as more and more businesses adopt this approach.”
Experience Dynamic’s Johnson suggests a simple experiment as a starting point: Load a PDF containing a set of non-confidential customer service information into OpenAI’s GPT chat builder. “Though doing so won’t create a bot ready for public use, it would provide a company with a sense of how helpful and accurate such a bot would be,” he says.
“Larger organizations with access to software development teams could take this a bit further and build a simple chatbot that uses customer service information and the OpenAI API to provide more nuanced interactions,” Johnson adds.
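A minimal version of the chatbot Johnson has in mind might look like the sketch below, assuming the OpenAI Python SDK and a local faq.txt file of non-confidential service information; the file name, model, and prompt are hypothetical.

```python
# Minimal sketch of a simple customer service chatbot on the OpenAI API.
# Assumes the OpenAI Python SDK and a local faq.txt file of non-confidential
# customer service information; the file name and model are hypothetical.
from openai import OpenAI

client = OpenAI()

with open("faq.txt", encoding="utf-8") as f:
    faq_text = f.read()

history = [{
    "role": "system",
    "content": "You are a customer service assistant. Answer only from the "
               "following information, and say you don't know otherwise:\n"
               + faq_text,
}]

while True:
    user = input("Customer: ")
    if not user:
        break
    history.append({"role": "user", "content": user})
    reply = client.chat.completions.create(
        model="gpt-4o-mini", messages=history
    ).choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print("Bot:", reply)
```

Keeping the full message history in each request is what lets the bot handle follow-up questions, the more nuanced interactions Johnson refers to.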
Such organizations could even explore integrating with internal systems to deliver a more useful customer experience as the technology matures, though that capability is likely to take months to develop.
“In the not-too-distant future, [genAI] will be an incredible advancement that can be leveraged in direct customer service. Until then, it is worth experimenting internally with the technology to prepare your company for when it eventually arrives, as these platforms are easily accessible and inexpensive to use,” Johnson says.
“This is the time to act,” Thorpe states emphatically. “I fully expect [additional] use cases to start getting deployed very, very quickly.”
Phillip Britt is a freelance writer based in the Chicago area. He can be reached at spenterprises1@comcast.net.