In most cases, we trust technology to do its literal job. Every time you make a phone call or turn on your television, you expect that the technology will respond accordingly. You do A; therefore, the technology will do B.
But this trust is often missing in interactions with conversational technologies.
I remember using a voice IVR (interactive voice response) system for the first time years ago. A robotic voice greeted me with, “Please tell me in a few words why you’re calling.”
I froze up. I didn’t know which few words would be the right few words. I said something that the IVR didn’t understand, and then I felt anxious because I didn’t know what to do. When I ventured to try something, I was disappointed with the outcome.
My lasting impression of conversational technologies was a feeling of anxiety followed by a feeling of disappointment.
At PolyAI, we work with many customers with various business goals like reducing call wait time or abandon rate, improving CSAT scores, surfacing data to drive operational changes, and optimizing revenue generation.
Whatever goal you have in mind when deploying automated systems, you must overcome that default feeling of anxiety or disappointment.
What do you do when you need automation to serve your customers, but your customers don’t trust the automated solutions you have to offer?
Developing a successful conversational assistant starts with acknowledging that customers, by default, do not trust automated solutions. The challenge is to turn that feeling of anxiety and disappointment into trust.
What is a customer-led conversational assistant?
Building trust begins with allowing customers to speak however they like. Instead of the conversational assistant saying, “Tell me in a few words why you’re calling,” it should ask an open question like, “How can I help?”
This gives customers the freedom to explain why they’re calling in their own words. Some customers will be skeptical and will give short answers, guessing at keywords they think the conversational assistant will be able to understand. Others will tell long stories in a lot of detail.
When a conversational assistant asks an open question, it must understand the response, whatever form it comes in, and then give a helpful answer. Trust is built as the customer gains confidence that the assistant is providing the best possible outcomes at every turn of the conversation.
We often hear customers begin conversations sounding skeptical, giving robotic, keyword-style answers. But as the conversational assistant proves its value by accurately understanding the caller and giving helpful responses, the customer opens up, trusting that the assistant can handle more detail and answer more complex questions.
The more things the conversational assistant gets right, the more customers trust its ability to solve their problems, and the more they engage, instead of requesting to speak to an agent.
Case study: Tracking orders with a global delivery company
One of PolyAI’s customers is a household-name delivery company that delivers sensitive documents on behalf of another organization. When the first COVID lockdown ended, the company was inundated with calls from customers who wanted to track their deliveries. The call center could not keep up with demand, and customers had to wait upwards of 30 minutes to speak to an agent.
Because of the sensitive nature of the documents, the rules around their delivery are strict. But for customers, getting hold of the documents is urgent. So they phone up to speak with an agent to try to negotiate a specific delivery slot or requirement that is outside of the company’s rules.
The conversational assistant needed to handle these negotiations in a way that enabled customers to trust that they were receiving the best possible outcome. Otherwise, customers would insist on speaking to an agent, offsetting the benefits of automation.
The conversational assistant has now been live for six months, and from day one it has been able to hold conversations with customers over dozens of turns for as long as ten minutes. While customers can be transferred to an agent should they wish to, the vast majority are content with the response from the conversational assistant. They may not have gotten exactly what they wanted, but they are confident that the conversational assistant has given them the best outcome possible.
A realistic brand voice drives trust
Another PolyAI customer receives a significant number of queries about the required course of action following the death of a loved one. It’s a sensitive situation that needs to be handled with care. While conversational assistants aren’t capable of feeling empathy, they can respond in a tone appropriate to the situation. In this instance, the conversational assistant’s tone becomes gentle and compassionate, instead of its usual upbeat and friendly brand voice.
Enduring customer relationships begin with trust. That won’t change in the age of AI. The goal of advancing conversational technologies isn’t to automate conversations, but to build customer relationships anchored in trust.
If you’d like to learn more, contact PolyAI today.