
For OpenAI, compute is no longer incidental; it is existential.
OpenAI is making a multibillion-dollar bet on bespoke silicon, doubling down on AMD and edging away from its reliance on Nvidia.
Let’s be blunt: artificial intelligence (AI) is an energy guzzler. Running large models, especially inference at scale and real-time interaction, demands billions of FLOPs, enormous memory bandwidth, and fast networking. As AI systems proliferate, the power draw is no longer an afterthought. In announcing its 10 GW deal with Broadcom, OpenAI said the deployment would begin in the second half of 2026 and finish by 2029.
Ten gigawatts is not trivial. To put it in perspective, 10 GW would be enough to power more than 8 million U.S. households.
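The household comparison is easy to sanity-check. A minimal sketch, assuming an average U.S. household uses roughly 10,500 kWh per year (an oft-cited figure in the EIA’s range; the exact number varies by year and region):

```python
# Back-of-envelope: how many average U.S. households could 10 GW supply?
HOURS_PER_YEAR = 8760
KWH_PER_HOUSEHOLD_YEAR = 10_500  # assumed average annual consumption

# Average continuous draw per household, in kW (~1.2 kW)
avg_draw_kw = KWH_PER_HOUSEHOLD_YEAR / HOURS_PER_YEAR

deployment_gw = 10
deployment_kw = deployment_gw * 1_000_000  # 1 GW = 1,000,000 kW

households = deployment_kw / avg_draw_kw
print(f"{households / 1e6:.1f} million households")
```

Under these assumptions the figure lands above 8 million, consistent with the claim in the text.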
Why go custom? Because dependence on general-purpose AI accelerators (hello, Nvidia) means you’re subject to supply, margin, and roadmap constraints. Building your own gear (or co-designing) lets you tailor the stack: chip, memory, interconnects, software. In the Broadcom deal, OpenAI will design the accelerators; Broadcom will build and deploy them.
One more twist: Broadcom’s networking tech (Ethernet, etc.) is intended to be integrated with this stack. This is, perhaps, an opportunity for OpenAI and Broadcom to displace Nvidia’s InfiniBand technology.
OpenAI isn’t putting all its eggs in one chip basket. In early October 2025, it struck a multi-year deal with AMD to deploy 6 GW of Instinct GPUs over several generations. The first tranche, 1 GW, will begin deploying in the second half of 2026.
That AMD arrangement includes an interesting wrinkle: AMD issued OpenAI warrants to acquire up to 160 million shares (about 10%) at a nominal price, vesting as deployment and share-price milestones are met.
Taken together, those agreements (Broadcom and AMD) suggest OpenAI is diversifying its compute partnerships while retaining leverage in its stack. It’s not abandoning Nvidia (which recently pledged 10 GW of systems), but it is signaling it wants more control.
If the math holds, OpenAI could control or influence some 16 GW of compute across its custom accelerators and AMD GPUs, with Nvidia’s pledged systems and third-party cloud or collaborative deals on top. That level of scale is not just ambitious, it’s borderline industrial.
This isn’t a vanity project. AI compute is on a Moore’s-Law-lite treadmill: the more models, the deeper the memory, the fatter the activation traffic, the bigger the cluster. The connections between compute and energy are multiplying.
Yet risks abound. Designing a chip is one thing. Executing yield, software stack maturity, cooling/infrastructure, supply chain (memory, packaging), and the ramp from prototype to volume are where dreams often die. Just ask Intel.
Also, the financing is staggering. Even at $50B-$60B per GW (a benchmark often cited for AI infrastructure), the Broadcom component alone could run into the hundreds of billions. OpenAI’s revenue is orders of magnitude smaller today. That implies heavy leverage, pre-commitments, and bet-the-future construction. Analysts have warned of the mismatch between OpenAI’s spending commitments and its current cash flow. “What’s real about this announcement is OpenAI’s intention of having its own custom chips,” Gil Luria, head of technology research at D.A. Davidson, told AP in an interview. “The rest is fantastical. OpenAI has made, at this point, approaching $1 trillion of commitments, and it’s a company that only has $15 billion of revenue.”
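The scale of that mismatch is simple multiplication. A rough sketch using the numbers cited above (the $50B-$60B-per-GW benchmark is an industry rule of thumb, not a disclosed figure):

```python
# Rough cost envelope for the 10 GW Broadcom tranche, in billions of dollars,
# using the commonly cited $50B-$60B-per-GW benchmark (an assumption).
broadcom_gw = 10
cost_per_gw_low, cost_per_gw_high = 50, 60  # $B per GW

low_total = broadcom_gw * cost_per_gw_low    # 500
high_total = broadcom_gw * cost_per_gw_high  # 600

print(f"Broadcom tranche: ${low_total}B to ${high_total}B")
```

Set against roughly $15B of annual revenue, even the low end of that range makes clear why analysts flag the funding gap.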
Still, for those who say “this is too much,” remember: AI is now as much about infrastructure as algorithms. The winners will be those who master both.
The Broadcom deal is audacious. The AMD deal is clever. Combined, they’re a bold wager: that compute is no longer a cost center, it’s the central battlefield of AI.
If these deals succeed, OpenAI might well emerge not just as a model house but a compute juggernaut. If they fail (in yield or funding), it could crowd out liquidity and distract attention from model advances. Either way, AI infrastructure just got a lot more interesting.

