OpenAI’s latest megadeals are testing its hyperscaler ambitions

Sam Altman didn’t set out to compete with Nvidia.
OpenAI started with a simple bet that better ideas, not better infrastructure, would unlock artificial general intelligence. But this view changed years ago when Altman realized that more computing, or processing power, meant more capability and, ultimately, more dominance.
On Monday morning, OpenAI announced its latest blockbuster deal, one that puts the company squarely into the chip-making business and into competition with the hyperscalers.
OpenAI is partnering with Broadcom to co-develop custom AI accelerator racks designed specifically for its models. It’s a big shift for a company that once believed intelligence would come from smarter algorithms, not bigger machines.
“What we found in 2017 was that we got the best results at scale,” the OpenAI CEO said on a company podcast Monday. “This wasn’t something we set out to prove. It was something we discovered experimentally because everything else nearly didn’t work.”
This insight, that scale rather than cleverness is the key, has fundamentally reshaped OpenAI.
The company is now extending this logic further by working with Broadcom to design and deploy custom silicon racks optimized for OpenAI’s workloads.
The deal gives OpenAI deeper control over its stack, from training frontier models to infrastructure, distribution, and the developer ecosystem that turns those models into durable platforms.
Altman’s rapid deals and product launches are creating an entire AI ecosystem, much as Apple did for smartphones and Microsoft did for PCs, with infrastructure, hardware, and developers at its core.

Hardware
Through its partnership with Broadcom, OpenAI is co-developing custom AI accelerators optimized for inference and specifically tailored to its models.
Unlike Nvidia and AMD chips designed for broader commercial use, the new silicon is built for vertically integrated systems that tightly couple compute, memory, and networking into full rack-level infrastructure. OpenAI plans to begin deploying the racks in late 2026.
The Broadcom deal is similar to what Apple did with its M-series chips: control the semiconductors, control the experience.
But OpenAI goes further and designs not just the chip, but every layer of the hardware stack.
The Broadcom systems are built on the Ethernet stack and designed to accelerate OpenAI’s core workloads, giving the company a physical advantage deeply intertwined with its software advantage.
At the same time, OpenAI is also getting into consumer hardware, a rare move for a model-first company.
The $6.4 billion all-stock acquisition of Jony Ive’s startup io added the legendary Apple designer to its inner circle, a sign that OpenAI didn’t just want to power AI experiences, it wanted to own them.
Ive and his team are exploring a new class of AI native devices designed to reshape how humans interact with intelligence by moving beyond screens and keyboards toward more intuitive, engaging experiences.
According to reports, one early concept involves a screenless, wearable device using voice input and subtle touch, designed more as an ambient companion than a traditional gadget.
OpenAI’s twin bets on custom silicon and emotionally resonant consumer hardware give it two more powerful layers of the stack under its direct control.

Blockbuster deals
OpenAI’s chips, data centers, and power combine into a single coordinated campaign called Stargate, providing the physical backbone of AI.
Over the past three weeks this campaign has gone into overdrive with several major deals:
- OpenAI and Nvidia have agreed on a framework for deploying 10 gigawatts of Nvidia systems, backed by a proposed $100 billion investment.
- AMD will supply OpenAI with multiple generations of Instinct GPUs under a 6-gigawatt deal. OpenAI could acquire up to roughly 10% of AMD if certain deployment milestones are met.
- Broadcom’s custom inference chips and racks are scheduled to begin deployment in late 2026 as part of Stargate’s initial 10-gigawatt phase.
Taken together, this is OpenAI’s effort to embed the future of AI into infrastructure it can call its own.
“We can think from etching the transistors to the token that appears when you ask ChatGPT a question and design the entire system,” Altman said. “We can make huge gains in efficiency, and that will lead to much better performance, faster models, cheaper models, all of that.”
Whether or not OpenAI delivers on every promise, Stargate’s scale and speed are already reshaping the market, adding hundreds of billions of dollars of market value for its partners and making OpenAI the de facto market leader in AI infrastructure.
None of its rivals seems able to match that pace or ambition, and the perception alone is a powerful advantage.
Developers
OpenAI’s DevDay made it clear that the company isn’t just focused on building the best models, it’s also investing in the people who develop with them.
“OpenAI is trying to compete on many fronts,” said Gil Luria, head of technology research at DA Davidson, pointing to its frontier models, consumer-facing chat product, and enterprise API platform. “It competes with some combination of all the major technology companies in one or more of these markets.”
Developer Day aims to help companies incorporate OpenAI models into their tools, he said.
“The tools they offered were very impressive – OpenAI has been fantastic at commercializing their products in an engaging and easy-to-use way,” he added. “However, they face an uphill battle because the companies they compete with have significantly more resources – at least for now.”
Luria said the main competition comes primarily from Microsoft Azure, AWS, and Google Cloud.
Developer Day signaled how aggressively OpenAI is leaning in.
The company launched AgentKit, new API offerings for developers and enterprises, and an app store offering direct distribution within ChatGPT, which now has 800 million weekly active users, according to OpenAI.
“This is Apple’s playbook: own the ecosystem and become a platform,” said Menlo Ventures partner Deedy Das.

Until now, most companies viewed OpenAI as a tool in their stack. But with new features to publish, monetize and distribute apps directly on ChatGPT, OpenAI is pushing for tighter integration and making it harder for developers to leave.
Microsoft CEO Satya Nadella followed a similar strategy after taking over from Steve Ballmer.
Nadella embraced open source to win developers’ trust and acquired GitHub for $7.5 billion, a move that signaled Microsoft’s return to the developer community.
GitHub later became the launching pad for tools like Copilot, putting Microsoft back at the center of the modern developer stack.
“OpenAI and all the major hyperscalers are moving towards vertical integration,” said Ben Van Roo, CEO of Legion Intelligence, a startup that builds secure broker frameworks for defense and intelligence use cases.
“Use our models and compute and create next-generation agents and workflows with our tools. The market is huge. We’re talking about replacing SaaS, large systems of record, and literally part of the workforce,” Van Roo said.
SaaS, or software as a service, refers to the category of enterprise software companies that includes Salesforce, Oracle, and Adobe.
Legion’s strategy is to remain model agnostic and focus on secure, interoperable agentic workflows that span multiple systems. The company is already deployed in classified Department of Defense environments and on platforms like NetSuite and Salesforce.
However, the same change also creates risks for model makers.
“Agents and workflows make some of the larger LLMs both powerful and less necessary,” he noted. “Without GPT-5, you can build reasoning agents with smaller, more specific workflows.”
Tools and agents built with leading LLMs have the potential to replace legacy software products from companies like Microsoft and Salesforce.
That’s why OpenAI is racing to build infrastructure around its models. This is not only to make them stronger, but also to make them harder to replace.
The real bet is not that the best model will win, but that the company with the most complete developer cycle will define the next platform era.
That’s the vision for ChatGPT now: not just a chatbot, but an operating system for artificial intelligence.
