When customer experience or operations teams within CSPs want to adopt AI, they need to find a way to ‘sell’ it to finance executives and employees. Neither is an easy job, according to a panel of experts.
Action Week 2018: Can CSPs move AI from ‘sci-fi’ to deployment?
“I was trying to convince finance that we needed the new tools to improve customer experience,” he said. “Either they were very skeptical and suspicious, saying it sounds like science fiction and is not something we want to invest our money in…or at the other end of the spectrum they say, ‘Oh wow, we can save that much money? Let’s lay off the entire contact center’. When you get that kind of reaction you have to dial it back and explain that it’s something that has to be phased in.”
“When we tried to introduce the idea, we certainly got a lot of pushback from the folks in the contact center who said, ‘This looks like a threat to our livelihood’,” Hamann said. “We tried to find ways to soften it, telling them that we were looking to offload the most mundane kinds of transactions so that they could handle more complicated customer issues.
“We even talked about using AI as a coach for contact center agents – so letting them know we weren’t getting rid of anyone but that the platform might help them resolve an issue much more quickly,” he added. “We tried a few different ways to position it as ‘robots are not coming for your job’, but certainly that is the initial reaction... ‘I need to feed my family, so please stop what you’re doing’.”
“We didn’t get far enough in our journey to prove out [the benefits],” he said. “We had some fits and starts and stops along the way, so we didn’t really get a chance to prove what we’re saying is true. In our case the skeptics kind of won…so it would be nice to have another crack at it because the technology has improved; the software has improved.”
“We may have very few CSP success stories that we are sharing, but there are companies out there that have AI in their DNA – for example, Alexa and Google Home. Those companies have embedded [AI] in their culture.”
“You’ve got to prove that it works,” Windstream’s Bartels said. “Your leaders need to believe in it and drive it, and you have to realize you’re not going to be successful every time.”
“You have a hypothesis and try to prove or disprove it,” he explained. “Whether you use AI to prevent network failures, help customers or create marketing offers, you have a control group where you’re not using AI and a group where you are using AI and see which one is better.”
He added: “You need to try to be as dispassionate as possible. That way it’s a transparent process. You’re not cheerleading AI. You’re not trying to hype it or promote it. You’re just experimenting.”
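The control-group experiment he describes can be sketched in a few lines. This is a minimal illustration, not anything the panel presented: the metric (ticket resolution time in minutes), the sample sizes, and the simulated numbers are all hypothetical, and Welch's t-statistic stands in for whatever test a real team would choose.

```python
import math
import random
from statistics import mean, stdev

random.seed(0)

# Hypothetical data: resolution times (minutes) for a control group
# handled without AI assistance vs. a test group with AI assistance.
# The distributions are invented purely for illustration.
control = [random.gauss(12.0, 3.0) for _ in range(200)]   # no AI
ai_group = [random.gauss(10.5, 3.0) for _ in range(200)]  # AI-assisted

def welch_t(a, b):
    """Welch's t-statistic for two independent samples of possibly
    unequal variance: difference of means over its standard error."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / math.sqrt(var_a / len(a) + var_b / len(b))

t = welch_t(control, ai_group)
print(f"control mean: {mean(control):.1f} min, "
      f"AI mean: {mean(ai_group):.1f} min, t = {t:.2f}")
```

The point of the sketch is the dispassionate structure Bartels describes: one group uses AI, one does not, and a neutral statistic decides which is better — no cheerleading either way.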
“A lot of organizations don’t have a definition that’s clear for everyone about what trust means,” she said. “Sometimes data is too guarded, but on the other end of the spectrum you need to have governance set up to make trust possible. The bottom line is that in order to trust AI systems, we first need to define what that trust means.”
“You have to have confidence in AI systems – that’s more of a technical trust,” she said. “The second is people trust – senior management and employees have to trust in AI.”