
Who is who in the GenAI value chain?

Mark Newman
11 Feb 2024

When it comes to building commercial models, a GenAI value chain is still emerging. That value chain – and the partnerships it is based on – is likely to change over time as new companies build their own LLMs and as successful GenAI application developers seek to extend their roles to improve their own products and generate new sources of revenue, as we explore in this extract from the report Generative AI: Operators take their first steps.

For the time being, the GenAI value chain is dominated by hyperscalers. They are developing multiple LLMs, some generalist and some specialist, and embedding GenAI-enabled services in their enterprise platforms. Perhaps most importantly, they have the deepest pockets when it comes to investing in building and training LLMs and the computing power that supports them – as well as investing in GenAI start-ups and fledgling companies.

Infrastructure

[Figure: The GenAI value chain]

The machine learning algorithms used to create text, images and audio through GenAI require vast amounts of data and fast memory to run effectively. This GenAI infrastructure space is dominated by semiconductor giants such as Nvidia, Intel and AMD, as well as by the hyperscalers, some of which are developing their own chips (among others, Google, Amazon, Microsoft, Meta, Apple and Baidu are all developing their own chips).

According to data from Omdia’s Competitive Landscape Tracker, the semiconductor industry recorded a revenue increase in the second quarter of 2023 after five straight quarters of decline. Omdia attributes this growth mainly to the “explosion in the AI segment, led by GenAI”. “The data processing segment, driven by AI chips into the server space, grew 15% quarter-over-quarter and makes up nearly one-third of semiconductor revenue (31% in 2Q23),” Omdia notes. The research firm says quarterly semiconductor revenue grew 3.8% to US$124.3 billion in the same period.

Cloud platforms

Hyperscalers develop the platforms that provide access to computer hardware that drives GenAI services. The three largest platform providers are Amazon Web Services (AWS), Microsoft Azure and Google Cloud. But other platform companies which can be expected to play a leading role in the telecoms industry’s adoption of GenAI include Salesforce, Oracle, IBM and ServiceNow.

The sheer scale of investment needed to rapidly build and scale GenAI products makes it extremely difficult for other companies to compete. That said, a huge amount of private equity has flowed into start-up GenAI businesses. According to The Wall Street Journal, investment in GenAI start-ups globally totalled $3.9 billion in 2022, rising sharply to $17.8 billion in the first nine months of 2023 (including $10 billion from Microsoft into OpenAI, $270 million into Cohere’s AI platform, and $100 million from SK Telecom into Anthropic).

One of the leading developers of LLMs for GenAI, AI21 Labs, has received $336 million in funding from companies including Intel, Comcast, Google and Nvidia.

The immediate opportunity for hyperscalers to monetize their investments in GenAI is in enriching their existing products and services. For example, they could use copilot services to bolster sales of existing productivity tools, charging an additional per-user fee. By opening copilots to developers, hyperscalers will also drive greater usage of their core cloud computing services.

Foundation models

This is the most dynamic part of the GenAI value chain. While hyperscale service providers are extremely active in building LLMs, many other companies – from start-ups to mature technology organizations – are seeking to build their own. Some of the most prominent companies here include:

  • OpenAI – the company that owns ChatGPT and in which Microsoft has invested an estimated $13 billion.
  • Anthropic, set up in 2021 by former OpenAI employees, which has developed Claude 2, a rival chatbot to ChatGPT. Amazon is reportedly investing up to $4 billion in the company and Google recently agreed to invest $2 billion.
  • Stability AI, a company set up in 2020 which describes itself as “the world’s leading open-source generative AI company”. It has created the Stable Diffusion text-to-image model and the StableLM language models, and has a partnership with AWS to run AI models on the SageMaker machine learning platform.
  • Cohere, an enterprise-focused GenAI firm set up by ex-Google Brain employees to deliver “language AI” (LLM) applications including chatbots, search engines and copywriting. It has key partnerships with Amazon, Google Cloud, Oracle and Salesforce.

New LLM initiatives are announced on a weekly basis, and the sector is trending towards more specialist LLMs that are cheaper to create and train and are geared towards specific market segments and geographies.

Machine learning models

Generative AI models can be complex and expensive to train because of the computing power and resources needed. This makes it difficult to scale them to production environments. Machine learning operations (MLOps) helps to streamline the process of taking machine learning models to production, and then maintaining and monitoring them. It can help to scale AI models by automating the training and deployment process, and makes use of a range of tools to curate, host, fine-tune and manage foundation models.

Applications and services

We are only at the very early stages of products being developed for consumer and B2B markets that use GenAI either off-the-shelf or with a degree of fine-tuning. While the barriers to entry for companies seeking to enter the GenAI infrastructure, cloud platform or foundation model parts of the value chain are significant – because of the sheer level of investment required – the GenAI applications space is wide open. Specialist knowledge in GenAI is already a valuable asset, and professional services firms are developing a range of services and solutions designed to help enterprises leverage the GenAI opportunity, including systems integration, training and professional services.

Telecoms and GenAI

So where does the telecoms industry fit into the GenAI value chain?

What is clear is that both telecoms operators and their technology partners see huge opportunities. Analyst companies are also bullish about monetization prospects. McKinsey, for example, has identified 63 generative AI use cases spanning 16 business functions that it says could deliver between $2.6 trillion and $4.4 trillion in economic benefits annually when applied across industries.

CSPs are already partnering with cloud platform providers – principally hyperscale service providers – and professional services firms as they take their first steps in GenAI. They are experimenting with off-the-shelf models with the expectation that they will train or fine-tune them for specific use cases. However, if operators do build their own LLMs, they will need access to computer hardware in either a public or private cloud environment and will need to build internal expertise in a range of MLOps tools.
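
As a rough illustration of the fine-tuning and MLOps workflow described above, the sketch below adapts an off-the-shelf language model to a hypothetical corpus of operator text. The report does not prescribe any tooling; the use of the open-source Hugging Face Transformers and Datasets libraries, the base model, the dataset file name and the hyperparameters are all illustrative assumptions.

```python
# Illustrative sketch: fine-tuning an off-the-shelf language model on
# operator-specific text (e.g. anonymized customer-care transcripts).
# Model name, dataset file and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "gpt2"  # stand-in for any off-the-shelf foundation model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Hypothetical curated dataset of domain text, one example per line.
dataset = load_dataset("text", data_files={"train": "care_transcripts.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="telco-llm-finetuned",
        num_train_epochs=1,
        per_device_train_batch_size=4,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("telco-llm-finetuned")  # handed off to a serving/MLOps pipeline
```

In practice, an MLOps toolchain would wrap steps like these with data curation, evaluation, model versioning and automated deployment and monitoring.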

CSPs’ technology suppliers are also seeking to embrace GenAI tools and capabilities. Given that operators’ own internal data – for example, data derived from customer interactions or from network operations – sits within the systems supplied by their operational and business support system (OSS/BSS) vendors, those vendors will have a crucial role to play in how operators use GenAI. The approaches of three of the companies sponsoring this research report provide good examples of how vendors will use GenAI. Netcracker is seeking to insert itself between the CSP and the LLM by ensuring that the LLM uses only the most relevant data – and only data that adheres to rules governing security and privacy – when, for example, it receives a prompt from a chatbot.
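
The pattern Netcracker describes – mediating between the chatbot prompt and the LLM so that only relevant, policy-compliant data is passed on – can be sketched generically as follows. This is an illustrative outline of the general approach, not Netcracker’s implementation; all function and field names are hypothetical.

```python
import re
from typing import Callable

# Fields that security/privacy policy forbids sharing with the LLM
# (hypothetical examples).
SENSITIVE_FIELDS = {"payment_card", "national_id", "home_address"}

def retrieve_context(prompt: str, customer_record: dict) -> dict:
    """Keep only the record fields the prompt actually refers to."""
    return {
        k: v for k, v in customer_record.items()
        if re.search(k.replace("_", " "), prompt, re.IGNORECASE)
    }

def apply_policy(context: dict) -> dict:
    """Strip fields that the CSP's security and privacy rules disallow."""
    return {k: v for k, v in context.items() if k not in SENSITIVE_FIELDS}

def answer(prompt: str, customer_record: dict, call_llm: Callable[[str], str]) -> str:
    """Ground the prompt in filtered customer data before calling the LLM."""
    context = apply_policy(retrieve_context(prompt, customer_record))
    grounded = f"Context: {context}\nQuestion: {prompt}"
    return call_llm(grounded)  # call_llm wraps whichever LLM the CSP has chosen

# Example: only the billing-related field reaches the model.
record = {"billing_plan": "5G Unlimited", "payment_card": "4111 ...", "home_address": "..."}
print(answer("What billing plan am I on?", record, call_llm=lambda p: f"[LLM sees] {p}"))
```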

Tecnotree greatly increased its AI expertise with the acquisition of AI engineering platform CognitiveScale in December 2022. It is now embedding GenAI capabilities into its existing BSS stack. One of these is a chatbot that will enable its customers to ask questions about the Finnish company’s products and solutions.

Billing provider Aria Systems is partnering with Salesforce to offer a new AI-optimized “concept-to-care” monetization solution. This will allow CSPs to enhance products in their catalogs, integrate them into their OSS/BSS systems and offer automation and GenAI capabilities from Salesforce Einstein – the company’s portfolio of AI products for CRM – to personalize customer touchpoints.

In the meantime, operators are still working out how they fit into the GenAI value chain. The box and video below outline some of the early GenAI use cases being experimented with at Vodafone and Microsoft. And on the next page we outline TM Forum’s work in AI and automation, including the development of a GenAI LLM for telecoms.