
OpenAI chip deal with Cerebras adds to roster of Nvidia, AMD, Broadcom

OpenAI CEO Sam Altman speaks during a session with SoftBank Group CEO Masayoshi Son at the “Transforming Business through Artificial Intelligence” event in Tokyo on February 3, 2025.

Tomohiro Ohsumi | Getty Images

In November, following Nvidia’s latest earnings, CEO Jensen Huang bragged to investors about his company’s position in AI, saying of the hottest startup in the space: “Everything OpenAI does is running on Nvidia today.”

While it’s true that Nvidia maintains its dominant position in AI chips and is now the world’s most valuable company, competition is emerging and OpenAI is doing its best to diversify while pursuing a historically aggressive expansion plan.

On Wednesday, OpenAI announced a $10 billion deal with chipmaker Cerebras, a relatively new player in the space that is eyeing the public market. It was the latest in a series of deals between OpenAI and the companies producing the processors needed to build large language models and run increasingly complex workloads.

Last year, OpenAI committed more than $1.4 trillion to infrastructure deals with companies including Nvidia, Advanced Micro Devices and Broadcom. The startup is on track to command a $500 billion private market valuation.

As OpenAI tries to meet expected demand for artificial intelligence technology, it has signaled to the market that it wants as much processing power as it can find. Here are the key chip deals OpenAI has signed as of January and potential partners to keep an eye out for in the future.

Nvidia

Nvidia founder and CEO Jensen Huang speaks at Nvidia Live at CES 2026 ahead of the annual Consumer Electronics Show in Las Vegas on January 5, 2026.

Patrick T. Fallon | Afp | Getty Images

OpenAI has relied on Nvidia’s graphics processing units since the early days of building large language models, long before the release of ChatGPT and the start of the generative AI boom.

In 2025, this relationship was taken to another level. Following an investment in OpenAI in late 2024, Nvidia announced in September that it would commit $100 billion to support OpenAI as it builds and deploys at least 10 gigawatts of Nvidia systems.

A gigawatt is a measure of power, and 10 gigawatts is roughly equivalent to the annual power consumption of 8 million U.S. households, according to a CNBC analysis of data from the Energy Information Administration. Huang said in September that 10 gigawatts would equal 4 to 5 million GPUs.
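The conversion above can be sanity-checked with a few lines of arithmetic. A minimal sketch, assuming an average U.S. household consumes roughly 10,800 kWh per year (an EIA-style estimate, not a figure from the article):

```python
# Sanity check of the gigawatts-to-households comparison.
# The per-household consumption figure below is an assumption
# (roughly the EIA's average annual U.S. residential usage).

GIGAWATT_TO_WATTS = 1e9
HOURS_PER_YEAR = 8760
AVG_US_HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed average

def households_powered(gigawatts: float) -> float:
    """Households whose annual consumption matches `gigawatts` running continuously."""
    annual_kwh = gigawatts * GIGAWATT_TO_WATTS * HOURS_PER_YEAR / 1000  # watt-hours -> kWh
    return annual_kwh / AVG_US_HOUSEHOLD_KWH_PER_YEAR

print(round(households_powered(10) / 1e6, 1))  # roughly 8 million households
```

With those assumptions, 10 gigawatts of continuous draw works out to about 8 million households per year, in line with the CNBC analysis cited above.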

“This is a huge project,” Huang told CNBC at the time.

OpenAI and Nvidia said the first phase of the project is expected to come online on Nvidia’s Vera Rubin platform in the second half of this year. But during Nvidia’s quarterly earnings call in November, the company said there was “no assurance” that its deal with OpenAI would move beyond an announcement into a formal contract phase.

Nvidia’s initial investment of $10 billion will kick in once the first gigawatt is completed, and the investments will be made at then-current valuations, as CNBC previously reported.

AMD

Lisa Su, president and chief executive officer of Advanced Micro Devices Inc., showcases the AMD Instinct MI455X GPU at the 2026 CES event in Las Vegas on January 5, 2026.

Bloomberg | Bloomberg | Getty Images

In October, OpenAI announced plans to deploy 6 gigawatts of AMD GPUs across multiple years and multiple generations of hardware.

As part of the deal, AMD granted OpenAI warrants for up to 160 million shares of AMD stock, which could amount to around a 10% stake in the company. The warrants include vesting milestones based on both deployment volume and AMD’s share price.
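The “around a 10% stake” figure can be roughly reproduced from the share count. A sketch, assuming AMD has about 1.63 billion shares outstanding (an estimate, not a number from the article; check AMD’s latest filings):

```python
# Rough check of the "~10% stake" claim in the AMD deal.
# Shares outstanding is an assumption, not from the article.

AMD_SHARES_OUTSTANDING = 1.63e9  # assumed; see AMD's latest filings
WARRANT_SHARES = 160e6           # from the article

stake = WARRANT_SHARES / AMD_SHARES_OUTSTANDING
print(f"{stake:.1%}")  # roughly 10%
```

Note this ignores dilution from the new shares themselves; counting them in the denominator pulls the figure closer to 9%.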

The companies said they plan to deploy the first gigawatt of chips in the second half of 2026, adding that the deal is worth billions of dollars, though a specific amount was not disclosed.

“You need partnerships like this to really bring the ecosystem together so that, you know, we can really get the best technologies out there,” AMD CEO Lisa Su told CNBC at the time of the announcement.

Altman sowed the seeds for the deal when he took the stage with Su at the AMD launch event in San Jose, California, in June. He said OpenAI plans to use AMD’s latest chips.

Broadcom

Hock Tan, CEO of Broadcom.

Lucas Jackson | Reuters

Later that month, OpenAI and Broadcom publicly announced a collaboration that has been in the works for more than a year.

Broadcom calls its custom AI chips XPUs, and a few customers have relied on them so far. But the potential deals have sparked so much excitement on Wall Street that Broadcom is now valued at more than $1.6 trillion.

OpenAI said it is designing its own AI chips and systems, which will be developed and deployed in partnership with Broadcom. The companies agreed to deploy 10 gigawatts of these custom AI accelerators.

In the October release, the companies said Broadcom will aim to begin deploying racks of AI accelerator and networking systems in the second half of this year and aim to complete the project by the end of 2029.

But Broadcom CEO Hock Tan told investors during the company’s quarterly earnings call in December that he doesn’t expect much revenue from the OpenAI partnership in 2026.

“We appreciate the fact that this is a multi-year journey that will last until 2029,” Tan said. “I call it an agreement, an alignment of where we are going.”

OpenAI and Broadcom did not disclose financial terms of the deal.

Cerebras

Andrew Feldman, co-founder and CEO of Cerebras Systems, speaks at the Collision conference in Toronto on June 20, 2024.

Ramsay Cardy | Sports file | Collision | Getty Images

On Wednesday, OpenAI announced a deal to deploy 750 megawatts of Cerebras artificial intelligence chips, which will come online in multiple phases by 2028.

Cerebras produces large wafer-scale chips that it says can respond up to 15 times faster than GPU-based systems. The company is much smaller than Nvidia, AMD and Broadcom.

OpenAI’s deal with Cerebras is valued at more than $10 billion, which could be a boon for the chipmaker as it weighs a potential exit in the public markets.

“We are delighted to partner with OpenAI, bringing the world’s leading AI models to the world’s fastest AI processor,” Cerebras CEO Andrew Feldman said in a statement.

Cerebras urgently needs major customers. The company in October withdrew its planned IPO, days after announcing it had raised more than $1 billion in a funding round. It had filed for an IPO a year earlier, but the prospectus revealed a heavy reliance on a single client in the United Arab Emirates, Microsoft-backed G42, which is also an investor in Cerebras.

Potential partners

OpenAI CEO Sam Altman speaks during a media tour of the Stargate data center on September 23, 2025, in Abilene, Texas. Stargate is a collaboration among OpenAI, Oracle and SoftBank, promoted by President Donald Trump, to build data centers and other infrastructure for artificial intelligence across the United States.

Kyle Grillot | Bloomberg | Getty Images

Where is this all going? Amazon, Google and Intel all have AI chip efforts of their own.

In November, OpenAI signed a $38 billion cloud deal with Amazon Web Services. OpenAI will run workloads through existing AWS data centers, but the cloud provider also plans to build additional infrastructure for the startup as part of the deal.

Amazon is also in talks to potentially invest more than $10 billion in OpenAI, as CNBC previously reported.

OpenAI may decide to use Amazon’s AI chips as part of these investment discussions, but nothing has been officially determined, according to a person with knowledge of the matter who asked not to be named because the discussions are confidential.

AWS announced its Inferentia chips in 2018 and the latest generation of its Trainium chips in late 2025.

Google Cloud also provides computing capacity to OpenAI, thanks to a deal that quietly came together last year. But OpenAI said in June that it has no plans to use Google’s in-house chips, called tensor processing units, which Broadcom also helps produce.

Intel has been the biggest AI laggard among traditional chipmakers, which helps explain why the company has recently received major investments from the U.S. government and Nvidia. Reuters reported in 2024, citing people with knowledge of the discussions, that Intel had a chance years earlier to invest in OpenAI and potentially make hardware for the then-fledgling startup, offering it a way to avoid relying on Nvidia.

According to Reuters, Intel passed on the deal.

Intel in October announced a new data center GPU, codenamed Crescent Island, that is “designed to meet the increasing demands of AI inference workloads and will deliver high memory capacity and energy-efficient performance.” The company said “customer sampling” is expected in the second half of 2026.

Wall Street will hear updates on Intel’s latest AI work when the company kicks off tech earnings season next week.

— CNBC’s Kif Leswing, MacKenzie Sigalos and Jordan Novet contributed to this report.

WATCH: Breaking down AI chips, from Nvidia GPUs to Google and Amazon’s ASICs
