
Infrastructure Vendors Were the Market Winners in AI’s First Year


The first year of the generative artificial intelligence (AI) era is winding down.

And what a year it has been. As difficult as it is to grasp the revolutionary impact AI might have on the global economy, the potential size of the innovation’s own market is even more difficult to comprehend.

After all, generative AI is poised to have a transformative impact across almost every piece of software and nearly all human endeavors.

But, one year into the technology’s commercialization, where has the most value accrued in the market so far — who is winning the AI race by establishing the best product-market fit?

The answer is a potentially surprising, if intuitive, one: the real market winners, at least so far, are the infrastructure vendors. These include cloud platform providers like Google, Microsoft and Amazon, as well as chip designers like NVIDIA and Arm.

That’s because when buzzy AI startups like Anthropic, Mistral, OpenAI and others raise eye-popping amounts of money from investors, the first thing they do is turn around and hand much of that money to infrastructure vendors for the compute needed to train their AI models.

After all, nearly every early industry finds itself facing an inventory and infrastructure challenge, with available resources waning as marketplace hype waxes.

That puts the B2B vendors critical to young, promising ecosystems in an enviable position.

See more: Who Will Power the GenAI Operating System?

AI Compute Doesn’t Come Cheap

A major goal of most AI firms over the past year was simply to get people to use their systems.

And while they largely succeeded, the infrastructure vendors that run training and inference workloads for generative AI models, including cloud platforms and computing hardware makers, are likely the biggest market winners so far.

Training generative AI requires owning or renting time on specialized hardware, significant data storage and intensive energy consumption, a structural cost profile that sharply diverges from the unit economics of previous computing and technological booms.

Some estimates place the cost of a single query on OpenAI’s ChatGPT platform at 1,000 times that of the same question asked of a normal Google search, making the margins for AI applications significantly thinner than those of other software-as-a-service (SaaS) solutions.

The high cost of the computing power AI models require forces any firm looking to compete in the space to shell out big. Significant capital investment, industry-leading technical expertise and, above all, expensive computing infrastructure built atop rows of increasingly scarce GPUs are all needed to establish and maintain generative AI models.

China’s five largest tech firms collectively placed a $5 billion chip order this past summer, hoping to build up their own foundational architectures in order to compete with Western tech companies.

See also: Peeking Under the Hood of AI’s High-Octane Technical Needs

It Takes Money to Make Money in the AI Market

Still, Google, Microsoft and Amazon are each leading cloud providers as well as AI players, meaning some firms, among them the most valuable in the world, are able to double-dip, building on their own infrastructure while also selling those key services to other companies.

As PYMNTS reported in October, Amazon Web Services revenue rose 12% year over year in the company’s most recent quarter, to $23.1 billion, while Google Cloud revenue grew 22% from a year earlier, almost double the rate of growth for the company as a whole.

The generative AI industry itself is expected to grow to $1.3 trillion by 2032, and PYMNTS Intelligence finds that 84% of business leaders believe generative AI will positively impact the workforce.

The technology isn’t going anywhere anytime soon, except into more and more enterprise workflows. So, while the model providers behind today’s cutting-edge foundational large language models (LLMs) may not have achieved large commercial scale just yet, there is still a very attractive runway ahead of them.

And if they want to emerge as the marketplace winners and take the crown from their own vendors, AI firms might want to take a page from the vendors’ book and focus on B2B, enterprise-based value propositions.

“AI is going to be an imperative for every company, and what you do with AI is what will differentiate your products,” Heather Bellini, president and chief financial officer at InvestCloud, told PYMNTS. “Functionally, it might get rid of a lot of the manual work people don’t want to do anyway and extract them up to a level where they can do more things that have a direct impact on the business.”

For further reading on AI solutions, the PYMNTS Intelligence “Generative AI Tracker®,” a collaboration with AI-ID, sorts the myths from the realities of AI and explains how businesses can leverage AI technology wisely and effectively.