
Google Cloud, Hugging Face Partner on AI Development

Months after investing in Hugging Face, Google has launched a partnership with the artificial intelligence (AI) firm.

The collaboration will let developers use Google Cloud infrastructure for all Hugging Face services and train Hugging Face AI models on Google Cloud, the companies announced in a news release Thursday (Jan. 25).

“The partnership advances Hugging Face’s mission to democratize AI and furthers Google Cloud’s support for open source AI ecosystem development,” the release said. “Developers will be able to easily utilize Google Cloud’s AI-optimized infrastructure including compute, tensor processing units (TPUs), and graphics processing units (GPUs) to train and serve open models and build new generative AI applications.”
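
As a rough illustration of what "train and serve open models" on GPU-backed infrastructure can look like in practice, here is a minimal sketch using the Hugging Face transformers library; the model name, prompt and device setup are illustrative assumptions, not details from the announcement.

```python
# Minimal sketch: loading an open Hugging Face model and running inference on
# GPU-backed infrastructure (e.g., a Google Cloud GPU VM). The model ID and
# prompt below are placeholder assumptions, not taken from the release.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gpt2"  # placeholder open model; any Hugging Face Hub model could be used
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

inputs = tokenizer("Generative AI applications can", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```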

According to the release, the partnership will give developers a way to “train, tune, and serve” Hugging Face models with Google Cloud’s Vertex AI to build new generative AI applications.
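
A hedged sketch of what serving a Hugging Face model through Vertex AI might look like with the google-cloud-aiplatform Python SDK follows; the project ID, region, serving container image URI, machine type and model ID are placeholder assumptions rather than values from the release.

```python
# Sketch only: uploading and deploying a Hugging Face model on Vertex AI via the
# google-cloud-aiplatform SDK. Project, region, container image URI, machine type
# and MODEL_ID are placeholder assumptions for illustration.
from google.cloud import aiplatform

aiplatform.init(project="my-gcp-project", location="us-central1")  # placeholders

model = aiplatform.Model.upload(
    display_name="hf-text-generation",
    # Placeholder: a container image that serves the chosen Hugging Face model.
    serving_container_image_uri="us-docker.pkg.dev/my-gcp-project/serving/hf-server:latest",
    serving_container_environment_variables={"MODEL_ID": "gpt2"},
)

endpoint = model.deploy(
    machine_type="g2-standard-12",   # GPU-backed machine type (assumption)
    accelerator_type="NVIDIA_L4",
    accelerator_count=1,
)

prediction = endpoint.predict(instances=[{"prompt": "Hello, world"}])
print(prediction.predictions)
```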

The collaboration also supports Google Kubernetes Engine (GKE) deployments, so Hugging Face developers can train their models on "do it yourself" infrastructure and scale using Hugging Face-specific Deep Learning Containers on GKE, the companies added. A rough sketch of that deployment path appears below.
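
For the GKE path, the following is a minimal sketch of deploying a containerized Hugging Face model server to a cluster with the official Kubernetes Python client; the container image, labels and resource requests are illustrative assumptions, since the release does not name the specific Deep Learning Container images.

```python
# Sketch: creating a Kubernetes Deployment on a GKE cluster that runs a
# containerized Hugging Face model server, using the official kubernetes client.
# The image URI, labels and resource requests are placeholder assumptions.
from kubernetes import client, config

config.load_kube_config()  # assumes kubectl is already pointed at the GKE cluster

container = client.V1Container(
    name="hf-model-server",
    image="ghcr.io/example/hf-model-server:latest",  # placeholder image
    env=[client.V1EnvVar(name="MODEL_ID", value="gpt2")],
    resources=client.V1ResourceRequirements(limits={"nvidia.com/gpu": "1"}),
    ports=[client.V1ContainerPort(container_port=8080)],
)

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="hf-model-server"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "hf-model-server"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "hf-model-server"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```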

Google was one of several big tech companies to invest in Hugging Face’s $235 million funding round in August of last year, valuing the company at $4.5 billion.

The partnership comes as Google and its cloud rivals Microsoft and Amazon increase their cloud spending in anticipation of growing AI usage.

As PYMNTS wrote last year, Google has reached “the quarter century mark in a new era of innovation demarcated by the generative AI capabilities and foundation models that its own research labs and teams helped spearhead.”

During an earnings call in 2023, CEO Sundar Pichai told investors that “more than half of all funded generative AI startups are Google Cloud customers.” The CEO has also said that 70% of generative AI unicorns were using Google Cloud. 

Meanwhile, PYMNTS this week published the “ABCs” of AI integration.

“The emergence of AI has brought with it an alphabet soup of acronyms, from large language models (LLMs) to recurrent neural networks (RNNs), artificial general intelligence (AGI), and beyond, making it critical for executives not to get lost in the weeds of AI’s technicalities,” that report said. “Having a working knowledge of AI’s technical language is important in understanding the growing landscape of AI solutions and identifying the best fit for particular organization-level needs.”

For all PYMNTS AI coverage, subscribe to the daily AI Newsletter.