Advances in artificial intelligence (AI), combined with voice recognition and machine learning, mean that technology can not only learn to make choices but also interpret a person's mood.
This opens the potential to offer customers choices far more simply and quickly than is possible today. That is clearly good not only for the customer but also for the efficiency and timeliness of an operator's customer service operation.
“This ability to understand mood means that if, for instance, a customer contacts his operator about his bill, and is clearly angry, an appropriate response can be delivered, quickly,” says Karthik Balakrishnan, of Amdocs. “This might entail a coupon, credit or simply a clear explanation and the problem is solved. If the customer is more responsive then the system might provide ideas for upselling opportunities, for instance a better plan that protects the customer against overage. The step beyond Artificial Intelligence is ‘designed’ intelligence and this is its application.”
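The mood-driven flow Balakrishnan describes can be sketched as a simple decision rule. This is a minimal illustration only: the sentiment labels, function name and response catalogue below are hypothetical, not part of Amdocs' actual system.

```python
# Illustrative sketch of a sentiment-driven response policy.
# Sentiment labels and responses are hypothetical examples.

def choose_response(sentiment: str, issue: str) -> str:
    """Map a detected customer mood to a next-best action."""
    if sentiment == "angry":
        # De-escalate first: a coupon, credit or a clear explanation.
        return f"Offer a coupon or credit and clearly explain the {issue}"
    if sentiment == "receptive":
        # A calm, engaged customer is a candidate for an upsell,
        # e.g. a better plan that protects against overage.
        return "Suggest a plan upgrade with overage protection"
    # Neutral or unknown mood: simply resolve the issue.
    return f"Resolve the {issue} and confirm satisfaction"

print(choose_response("angry", "billing dispute"))
```

In a real deployment the sentiment label would come from a voice-analytics engine rather than being passed in directly; the point here is only the routing of mood to an appropriate response.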
At the heart of this lies voice control. We are increasingly interacting with systems through voice commands. Indeed, research suggests that by 2020, 85 percent of applications (mostly wearables) will have embedded speech recognition and natural language processing (NLP) capabilities. The driver is that many devices simply lack the real estate for a keyboard: voice is the input mechanism. And voice can also give us important clues about a customer's mood.
Segmentation of one
“This means that we can get to segmentation of one, and actually achieve personalization. And what is exciting is that it is driven by the customer not by the operator,” says Karthik.
The group that is setting out to prove this concept is the Sentimental Applications Catalyst and it is championed by Bell Canada, widely acknowledged as one of the most forward-thinking operators in the world. The participants are Amdocs, BeyondVerbal, CallVU and Microsoft.
This project is heavily based on TM Forum's Open APIs. The group envisions a plethora of wearables that will interact with an AI engine and invoke Open APIs to fulfil a user's request. As such, the Catalyst will both draw on the Open API Program and contribute to it.
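As a rough illustration of that architecture, the sketch below translates an interpreted voice request into a TM Forum Open API call. Customer Bill Management and Product Ordering are real Open APIs, but the base URL, version segments, IDs and the translation function itself are placeholders, not taken from the Catalyst.

```python
import json

# Sketch: map an intent recognized by the AI engine to a TM Forum
# Open API request. Host, paths and payloads are illustrative only.

def build_open_api_request(intent: str, customer_id: str) -> dict:
    """Describe the REST call a given voice intent would trigger."""
    base = "https://api.example-operator.com/tmf-api"  # placeholder host
    if intent == "check_bill":
        # Customer Bill Management-style lookup (illustrative path).
        return {
            "method": "GET",
            "url": f"{base}/customerBillManagement/v1/customerBill"
                   f"?relatedParty.id={customer_id}",
        }
    if intent == "change_plan":
        # Product Ordering-style request (illustrative payload).
        return {
            "method": "POST",
            "url": f"{base}/productOrderingManagement/v1/productOrder",
            "body": json.dumps({
                "relatedParty": [{"id": customer_id}],
                "description": "Plan change requested by voice",
            }),
        }
    raise ValueError(f"Unrecognized intent: {intent}")
```

The value of the Open API approach here is that the AI engine only needs to decide *what* the customer wants; the standardized APIs define *how* that request reaches the operator's systems.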
This is an exciting project in itself, but it also represents an evolution in software engineering, in which developers create capabilities and AI powers the application flows, based on an understanding of a user's needs and sentiment.
The Catalyst will be on show at the TM Forum Live! event (Nice, May 15-18) and could just show how operators can indeed embark on a sentimental journey with their customers.