BT's Colin Bannon on reinventing network services
Colin Bannon, CTO of BT Business, discusses AI’s impact on the network and why he believes the telco edge is a ‘Goldilocks location’ for AI.
When BT Group consolidated its Global and Enterprise divisions into a single entity under the BT Business brand last year, the company said it would develop “next generation connectivity and unified communications, multi-cloud networking, and advanced security solutions” – all of which sounds like regular telco fare. But the company’s new network-as-a-service (NaaS) platform, BT Global Fabric, is anything but ordinary as it provides the foundation for BT to radically change how it sells cloud connectivity to enterprises.
In a wide-ranging interview with Inform, Colin Bannon, CTO of BT Business, discussed the impetus behind BT’s NaaS transformation, how cloud and AI technologies are impacting network architecture, and why he believes AI is finally creating a business case for telco network edge services.
Bannon also issued a call to arms for telcos to better articulate the value of the network.
“We have been astonishingly bad at marketing the importance of network, whereas our colleagues in the software world, in the cloud world – the hyperscalers – have done a remarkable job of defining reality through their lenses,” he said.
“This silence of the telcos needs to end,” Bannon added. “A good quality, deterministic network is a prerequisite for cloud to work. And we’ve forgotten that.”
NaaS plus AI
BT’s NaaS transformation is two-pronged. First, it’s an internal transformation that gives BT “a set of very fine-grained controls internally for us to do our own networking,” according to Bannon.
“However, now that you have centralized path computation and orchestration and some northbound APIs at the layer of abstraction above, you can start to create a language that you can use to translate business intent into meaningful action on the network,” he added.
“It really enriches the network more than ever before, from a dumb pipe to something that is capable of fulfilling your intent,” Bannon said.
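To make the idea concrete, here is a minimal sketch of how a business-level intent might be expressed and handed to a NaaS controller through a northbound API. It is illustrative only: the endpoint, field names and BusinessIntent structure are hypothetical, not BT Global Fabric's actual interface.

```python
"""Illustrative sketch: translating a business intent into a northbound-API
call on a NaaS controller. The endpoint path, payload shape and BusinessIntent
fields are assumptions, not BT Global Fabric's real API."""

import json
from dataclasses import dataclass
from urllib import request


@dataclass
class BusinessIntent:
    application: str      # e.g. "payments"
    sites: list           # locations that must be connected
    max_latency_ms: int   # business-level SLO, not a device config
    cloud_provider: str   # target cloud on-ramp


def intent_to_policy(intent: BusinessIntent) -> dict:
    """Translate the business-level intent into a controller policy request.

    A real controller would run centralized path computation against live
    topology; here we only shape the request it might receive."""
    return {
        "policy": {
            "name": f"{intent.application}-connectivity",
            "endpoints": intent.sites + [f"cloud:{intent.cloud_provider}"],
            "constraints": {"latency_ms": {"max": intent.max_latency_ms}},
            "action": "provision",
        }
    }


def submit(policy: dict, controller_url: str = "https://naas.example.net/v1/policies"):
    """POST the policy to a hypothetical northbound API endpoint."""
    req = request.Request(
        controller_url,
        data=json.dumps(policy).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return request.urlopen(req)  # would need a real controller to succeed


if __name__ == "__main__":
    intent = BusinessIntent("payments", ["london-dc1", "frankfurt-dc2"], 20, "aws")
    print(json.dumps(intent_to_policy(intent), indent=2))
```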
AI enhances these capabilities significantly. From a customer service and assurance perspective, for example, the ability to interrogate a firewall’s data with natural-language questions can help engineers identify and remediate faults.
“Context is king with AI,” Bannon said. “You need to know what questions to ask AI to get the best out of AI. And we are certainly very comfortable that we know most of the good questions about how to do networking very, very well and make it resilient, sustainable, strategic and sovereign. You get deterministic outcomes from that, and great experience.”
Indeed, Bannon sees potential to use LLMs in the network to help CIOs understand what’s happening via an app. “I want to talk to my packets and network infrastructure and have them tell me what’s wrong,” he said.
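A minimal sketch of what that kind of natural-language interrogation could look like in practice is below. The log format and the ask_llm() stand-in are assumptions for illustration, not a BT or supplier API.

```python
# Illustrative only: combining a plain-language question with firewall
# telemetry before sending it to an LLM for fault triage. The log lines and
# the ask_llm() placeholder are assumptions, not a BT or vendor interface.

FIREWALL_LOGS = [
    "2024-05-01T09:12:03Z DENY tcp 10.1.4.22:51344 -> 172.16.0.10:443 rule=87",
    "2024-05-01T09:12:04Z DENY tcp 10.1.4.23:51891 -> 172.16.0.10:443 rule=87",
    "2024-05-01T09:12:05Z ALLOW tcp 10.1.4.24:52010 -> 172.16.0.11:443 rule=12",
]


def build_prompt(question: str, logs: list) -> str:
    """Give the model the operational context it needs ('context is king'),
    then ask the operator's question in plain language."""
    return (
        "You are assisting a network operations engineer.\n"
        "Recent firewall events:\n"
        + "\n".join(logs)
        + f"\n\nQuestion: {question}\n"
        "Answer with the likely fault and a suggested remediation."
    )


def ask_llm(prompt: str) -> str:
    """Placeholder for whatever model endpoint is actually in use."""
    raise NotImplementedError("wire this to your LLM provider of choice")


if __name__ == "__main__":
    print(build_prompt("Why are clients failing to reach 172.16.0.10?", FIREWALL_LOGS))
```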
BT’s supplier partners are also adding AI to platforms that handle large volumes of data, Bannon noted.
“That’s less about LLM and more about neural modeling and neural learning – the generative side, where you’re taking a lot of data that doesn’t necessarily have KPIs and trying to baseline what good looks like and then detect anomalies, or take traffic flows and optimize them,” he explained. “So, there is optimization, prediction of future failure, and at some point we’ll be getting to auto remediation.”
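The baseline-and-detect step Bannon describes can be pictured in a few lines of code. The traffic samples and the 3-sigma threshold below are assumptions chosen for the sketch, not a vendor's algorithm.

```python
# Illustrative only: baseline "what good looks like" from recent traffic
# samples with no predefined KPI, then flag anomalies. Synthetic data and a
# simple 3-sigma rule stand in for whatever the real platforms use.

from statistics import mean, stdev

# Five-minute interface throughput samples in Mbps (synthetic).
samples = [410, 398, 405, 422, 415, 401, 409, 418, 640, 412, 407]


def detect_anomalies(values, window=6, sigma=3.0):
    """Baseline from a trailing window, then flag points that deviate by
    more than `sigma` standard deviations from that baseline."""
    flagged = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if sd and abs(values[i] - mu) > sigma * sd:
            flagged.append((i, values[i]))
    return flagged


print(detect_anomalies(samples))  # the 640 Mbps spike stands out
```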
AI’s unknown impact
However, like Usman Javaid and Bruno Zerbib from Orange, who suggest that telcos need to rethink network architecture because of AI, Bannon highlights the unknowns around how AI computation will impact the network itself.
“Personally, as somebody who’s in the business of moving packets around as a service provider, I’m thrilled to have AI, because of course it drives more packet movement,” Bannon said. “We already see indications on local area networks, including our own, that there is an uptick of network traffic for things like Copilot.
“What we haven’t been able to do is fully quantify that,” Bannon added. “That’s one of the biggest hidden questions that are running behind the scenes over the next two years: When this all settles and we get end users using these tools…and people have got their LLMs working and they’ve started to privately curate their own data set that is meaningful to their business…what will the traffic volume be? And what are the implications around sensitivity to latency, etc.?”
Bannon posits that even a relatively small traffic increase of 10% per user will trigger “significant corporate refresh cycles” for telcos and enterprises alike.
“Some of this AI data is payload heavy and some of it’s quite light, but either way an increase of only 5 or 10% can be material when you aggregate it up,” he explained. “If there’s another 10% in one year of traffic in a country’s backbone and then you have a football game – the Super Bowl or something like that – and it’s all streaming and then you get a Fortnite patch, all of a sudden as a national carrier, it starts to get interesting.
“So, that’s something we and other telcos need to get a grip on,” Bannon added. “But to be fair, Microsoft can’t even tell us, and Nvidia can’t even tell us yet. It’s still: watch this space.”
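The aggregation arithmetic behind Bannon's point is easy to sketch. Every figure below is an assumption chosen to make the sums simple, not a BT measurement.

```python
# Illustrative only: how "10% per user" becomes material at national scale.
# All figures are assumptions for the arithmetic, not BT data.

users = 5_000_000                 # business users on a national backbone
baseline_per_user_mbps = 2.0      # average busy-hour demand per user
ai_uplift = 0.10                  # the ~10% increase Bannon describes

baseline_gbps = users * baseline_per_user_mbps / 1_000
extra_gbps = baseline_gbps * ai_uplift

print(f"Baseline busy-hour load: {baseline_gbps:,.0f} Gbps")
print(f"AI uplift alone:         {extra_gbps:,.0f} Gbps")
# Under these assumptions the uplift alone is ~1 Tbps of new demand, before a
# live-streamed match or a large game patch is layered on top.
```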
Edge: a prime location
So far, the edge-computing market has lingered in the trough of disillusionment. But Bannon believes that the rising amount of AI traffic strengthens telcos’ position at the edge of the network.
“The current working thesis is that you do training centrally, but your inference models may be on edge compute that sits deeper in the network,” he explained, pointing to a shift in data processing locations. “My personal belief is that the Goldilocks location for this is at the service provider edge, because you can handle both the mobile and the fixed in those locations.”
Bannon believes Apple’s announcement in June that it will integrate OpenAI’s ChatGPT technology into its iPhones could increase AI traffic at the telco edge. Chipmaker Nvidia envisions this, too, and is courting telcos, hoping to get them to buy a new superchip to support AI applications at the edge and in the radio access network (RAN). For its part, however, Apple contends that most AI inference processing can be handled on its devices.
“Inference models can be quite portable, or compactable,” Bannon acknowledged. “However, there will be other aspects of AI that won’t necessarily fit on the phones… If you’re going to replace Siri with something that understands the last five or ten interactions you’ve had – contextually understands so that you’re not starting from scratch with every instruction – that’s not necessarily processing and storage that will happen on the phone. That will drive interrogation back to the mothership, or to the edge, and that will drive additional network traffic.”
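One way to picture the placement question Bannon raises is a simple policy that decides where each inference request should run. The thresholds and request attributes below are assumptions for the sketch, not any vendor's logic.

```python
# Illustrative only: a toy placement decision of the kind the "Goldilocks"
# argument implies: run small, stateless inference on the handset and send
# context-heavy, latency-sensitive requests to the service-provider edge.

from dataclasses import dataclass


@dataclass
class InferenceRequest:
    model_size_mb: int       # footprint of the model needed to answer
    context_tokens: int      # prior interactions the assistant must recall
    latency_budget_ms: int   # how quickly the user expects a response


def place(req: InferenceRequest) -> str:
    """Decide where to run inference for this request (assumed thresholds)."""
    if req.model_size_mb <= 2_000 and req.context_tokens <= 4_000:
        return "on-device"               # fits the handset's compute and memory
    if req.latency_budget_ms <= 100:
        return "service-provider edge"   # too big for the phone, too
                                         # latency-sensitive for a distant region
    return "central cloud"               # batch-like or latency-tolerant work


print(place(InferenceRequest(model_size_mb=1_500, context_tokens=1_000, latency_budget_ms=50)))
print(place(InferenceRequest(model_size_mb=8_000, context_tokens=32_000, latency_budget_ms=80)))
```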
Speaking up
In order to exploit the potential of AI at the edge, CSPs need to become better storytellers, according to Bannon.
“The challenge for telcos is that there is a disconnect today and a real problem that we have sleep-walked into, and that is the lack of appreciation of the value of the network,” he said, explaining that enterprises often treat network services as a cost center, not a critical tool for their business.
“They might be paying $10,000 per month for a circuit, but the business transactions they’re doing over it are $10 billion,” Bannon pointed out.
“We are being treated as a volume, commodity item. And we need to figure out a way to break out of that and get back to the value conversation,” he said.