Generative AI races toward $1.3 trillion in revenue by 2032

This analysis is by Bloomberg Intelligence Senior Industry Analysts Mandeep Singh and Anurag Rana, with contributing analysis by Nishant Chintala, Charles Shum, and Steven Tseng. It appeared first on the Bloomberg Terminal.

Generative AI is poised to become a $1.3 trillion market by 2032, expanding at a compound annual rate of roughly 43% as it boosts sales across the tech industry’s hardware, software, services, advertising and gaming segments, according to our proprietary market-sizing model. Meta, Nvidia, Microsoft, Alphabet and Amazon.com stand to be at the center of training for large language models.
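
For a rough sense of how those headline figures hang together, the sketch below works backward from the $1.3 trillion 2032 estimate and the roughly 43% compound annual growth rate to the 2022 base they imply. The 10-year, 2022-2032 horizon is an assumption read off the dates used elsewhere in this analysis, not a figure quoted from the model itself.

```python
# Back-of-the-envelope check of the headline figures.
# Assumption: a 2022-2032 horizon, i.e. 10 compounding periods.
TARGET_2032 = 1.3e12   # $1.3 trillion market by 2032 (stated above)
CAGR = 0.43            # ~43% compound annual growth rate (stated above)
YEARS = 10             # assumed 2022 -> 2032 horizon

# Solve base * (1 + CAGR)**YEARS = TARGET_2032 for the implied 2022 base.
implied_2022_base = TARGET_2032 / (1 + CAGR) ** YEARS
print(f"Implied 2022 market size: ${implied_2022_base / 1e9:.0f} billion")
# Prints roughly $36 billion, consistent with generative AI accounting for
# less than 1% of total tech spending today.
```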

Headed to 10-12% Share of Total Tech Spending

Generative AI could expand to 10-12% of total information-technology hardware, software, services, advertising and gaming expenditures by 2032 from less than 1% today, according to our analysis. Training of AI platforms (creating a machine-learning model using large datasets) will be key, driven initially by spending on servers and storage and eventually by cloud-related infrastructure. Mainstream adoption may speed the refresh cycle for PCs and smartphones, especially as more compact models like Google’s Gemini Nano are built for edge devices.

New verticals may emerge in software and gaming, including specialized AI assistants, drug-discovery software and virtual goods. Additional opportunities could arise as the technology evolves.

Chart: Generative AI spending

Training market on pace to reach $470 billion

The training market will likely grow faster than inference in the near term and could reach $470 billion by 2032. The use of semiconductor accelerators should broaden as more companies ramp up investments in building their own large language models (LLMs) similar to Meta’s Llama, Alphabet’s Gemini and the GPT models that power OpenAI’s ChatGPT. Servers and storage may be the most prominent segments in the short run, as companies build out AI infrastructure to handle increased computational requirements. Eventually, most companies may look to the public cloud to deploy their generative-AI workloads.

Hyperscale suppliers including Meta, Microsoft, Alphabet, Nvidia and Amazon.com will likely be among the main facilitators for training LLMs.

Chart: Generative AI as a Service

Hardware market poised to hit $640 billion

Computer vision and conversational AI products will emerge as new categories for inference, given the availability of large language models for domain-specific predictions. That may accelerate growth in the $1 trillion devices market, where smart speakers and wearables tethered to phones are already a large category. AI training infrastructure will be essential to run these heavy workloads, creating demand for high-capacity servers and storage. AI enhancements could speed up refresh cycles for PCs, smartphones and other devices.

AI-related hardware could reach $640 billion by 2032 from less than $40 billion in 2022. Microsoft, Apple, Alphabet, Nvidia and Amazon.com are most exposed to the opportunity.
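
The same arithmetic can be run the other way for those hardware endpoints: solving for the growth rate implied by a move from less than $40 billion in 2022 to $640 billion in 2032, again assuming a 10-year horizon read off the dates in the text.

```python
# Implied compound annual growth rate for AI-related hardware, using the
# endpoints cited above. The 10-year horizon is assumed from the 2022 and
# 2032 dates in the text.
base_2022 = 40e9      # "less than $40 billion in 2022"
target_2032 = 640e9   # "$640 billion by 2032"
years = 10

implied_cagr = (target_2032 / base_2022) ** (1 / years) - 1
print(f"Implied hardware CAGR: {implied_cagr:.0%}")  # roughly 32% a year
```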

Chart: Training and inference revenue

Software sales to grow by $318 billion

Rising demand for generative-AI products could add about $318 billion in software spending by 2032, growing 71% a year from 2022. Cybersecurity, drug discovery, AI assistants and coding workflows are among the software categories most likely to drive the additional outlays. We expect most software products will include an AI assistant to enhance user productivity, and that spending on infrastructure will outpace that for applications.

The biggest benefit could come in software coding, possibly easing pressure from the persistent shortage of developers. Microsoft’s GitHub Copilot, which is built on OpenAI models and priced at $19 a user per month for businesses, is one product that provides code recommendations to developers, substantially reducing coding time.
