Will telcos profit from AI?
Nvidia thinks so, but power is an issue.
The AI hype has been so loud it has risked drowning out practical discussions, such as how communications service providers (CSPs) can use AI to both grow their businesses and make their operations far more efficient, while meeting their sustainability objectives.
Now the mood is shifting, and the telecoms industry is evaluating what is attainable with AI. Chris Penrose, Global VP of Business Development at Nvidia, offered one potential answer as day two of TM Forum Innovate Americas 2024 kicked off: build networks that carry new AI traffic in the RAN alongside voice, data, and video traffic.
Creating the AI-RAN
One of Nvidia’s recently announced goals is to “make the AI-RAN a reality.” To that end, Penrose said, the company has developed a platform designed to enable telcos to accelerate their RANs by running them over Nvidia GPU infrastructure. Penrose said that CSPs can “plug into this architecture” which provides “building blocks to help the ecosystem bring innovation to the marketplace.”
The platform, Penrose said, is not only "open, AI-native, and software-defined" but also designed so that CSPs will be able to run 6G infrastructure over it and operate it in public or private cloud environments. Penrose said that CSPs can go full speed now in addressing network operations and how they manage and optimize AI traffic. On the AI-RAN side of things, and particularly for 6G, he acknowledged that work with major RAN vendors like Nokia is ongoing, but said CSPs can use, test, and learn with the platform today.
Many CSPs are now "leaning into AI to drive operational efficiency," Penrose explained. He encouraged CSPs to start there, honing expertise on the platform with internal workloads and applications, before bringing services to the diverse industries and governments they serve.
Ultimately, however, he said the purpose of a platform like the one Nvidia has developed is to help CSPs find new ways to grow their businesses in the face of declining revenues in traditional services and heavy competition in mobile: “Carrying new AI traffic in addition to RAN voice, video, and data traffic is a massive opportunity for telcos.”
What’s the catch?
The burning question is whether Penrose is right. Logically, connectivity is necessary to move AI traffic from place to place, and CSPs are well positioned, if not best positioned, to provide it. But the real fly in the ointment may have nothing to do with telecoms at all: energy consumption.
Recent research from IDC finds that “electricity is by far the largest ongoing expense for datacenter operators, accounting for 46% of total spending for enterprise datacenters and 60% for service provider datacenters.”
AI workloads are energy-intensive and demand for them is rising across industry sectors. As a result, IDC expects AI to contribute to a significant increase in datacenter capacity, energy consumption, and carbon emissions.
The technology research company projects that AI datacenter capacity will grow at a compound annual growth rate (CAGR) of 40.5% through 2027, and that AI datacenter energy consumption will grow at a 44.7% CAGR, reaching 146.2 terawatt-hours (TWh) by 2027.
Overall, IDC expects “global datacenter electricity consumption to more than double between 2023 and 2028 with a five-year CAGR of 19.5% and reaching 857 Terawatt hours (TWh) in 2028.”
For comparison, the world's total final electricity consumption in 2019 was 22,848 TWh, according to the International Energy Agency.
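The relative scale of these figures is easier to see with a quick back-of-the-envelope calculation. The sketch below (Python) uses only the numbers quoted above; the 2023 datacenter baseline is inferred from IDC's 2028 endpoint and CAGR rather than quoted directly, so treat it as an approximation.

```python
# Back-of-the-envelope check on the IDC and IEA figures cited above.
# The 2023 baseline is derived from the 2028 endpoint and CAGR, not quoted.

AI_ENERGY_2027_TWH = 146.2   # IDC: AI datacenter energy consumption by 2027
TOTAL_DC_2028_TWH = 857.0    # IDC: global datacenter consumption in 2028
TOTAL_DC_CAGR = 0.195        # IDC: five-year CAGR, 2023-2028
GLOBAL_2019_TWH = 22_848     # IEA: world final electricity consumption, 2019

# CAGR definition: end = start * (1 + r)**years, so start = end / (1 + r)**years
implied_dc_2023 = TOTAL_DC_2028_TWH / (1 + TOTAL_DC_CAGR) ** 5

print(f"Implied 2023 datacenter consumption: {implied_dc_2023:.0f} TWh")
print(f"Datacenter growth 2023 -> 2028:      {TOTAL_DC_2028_TWH / implied_dc_2023:.1f}x")
print(f"AI's 2027 draw vs 2019 global total: {AI_ENERGY_2027_TWH / GLOBAL_2019_TWH:.1%}")
```

Run as a rough check, this is consistent with IDC's "more than double" claim (about 2.4x growth from an implied baseline of roughly 352 TWh in 2023), while the 2027 AI-specific figure still amounts to well under 1% of 2019 global electricity consumption.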
Certainly, some suppliers report that demand for AI workloads is not just fierce, but pent up. An expert from a key technology supplier, speaking on background, noted that while their company has many GPUs reserved with Nvidia into the future, there is not yet enough power to light them all up. This is a challenge the entire AI industry faces, as energy demand from AI growth outstrips energy supply.
Got the power?
Throughout the first day of the conference, many speakers expressed the view that AI traffic is growing faster than it can be either powered or transported. AI has driven double-digit growth in the data center industry, and Nvidia argues it will drive massive growth for CSPs. But with new data centers, particularly in the US, being constructed in rural and remote areas where land is cheap but access to the power grid still needs to be built out, there will be a lag.
The hype around AI remains, but reality has now entered the conversation. Even if the opportunities for CSPs prove real, cross-industry collaboration that includes CSPs, hyperscalers, and sustainable energy providers will be necessary to keep the AI revolution and its many new business opportunities rolling forward.