Intel Introduces Gaudi 3 to AI Chip Market

Intel announced the release of its latest artificial intelligence chip, the Gaudi 3, as part of an industry-wide effort to develop chips capable of running AI models, such as those used in OpenAI’s ChatGPT.

The Gaudi 3 chip offers an average of 40% better power efficiency than Nvidia’s H100 GPU, according to a Tuesday (April 9) press release.

The chip is available in various configurations, including an eight-chip package on a single motherboard or as a standalone card that can be integrated into existing systems, CNBC reported Tuesday.

Intel tested the Gaudi 3 with several AI models, including Meta’s Llama and the Abu Dhabi-backed Falcon model, the release said.

The Gaudi 3 can train and deploy a range of models, from Stable Diffusion to OpenAI’s Whisper for speech recognition, while maintaining lower power consumption compared to Nvidia’s products, CNBC reported.

“Innovation is advancing at an unprecedented pace, all enabled by silicon — and every company is quickly becoming an AI company,” Intel CEO Pat Gelsinger said in the release. “Intel is bringing AI everywhere across the enterprise, from the PC to the data center to the edge.”

Intel's push comes as rival Nvidia continues to innovate in the AI chip market. Experts believe Nvidia's latest chips, such as the B200 GPU and the GB200, could transform the commerce industry by enabling faster, more efficient execution of AI applications.

Announced March 18 during Nvidia CEO Jensen Huang’s keynote at the company’s annual developer conference, the B200 GPU has 208 billion transistors, delivering up to 20 petaflops of FP4 processing power.

Nvidia also highlighted the GB200, the inaugural chip in its Blackwell series of AI graphics processors. The GB200, which combines two B200 GPUs with a single Grace CPU, can boost performance up to 30 times for large language model inference tasks. The configuration is also engineered to be more efficient, potentially reducing costs and energy consumption by up to 25 times compared with the previous H100 model.

“With the B200’s ability to analyze vast amounts of data, businesses can more accurately predict customer demand,” Lars Nyman, chief marketing officer at CUDO Compute, told PYMNTS in an interview last month. “This allows for better inventory management, reducing the risk of stockouts and overstocking.”
