OpenAI has agreed to spend more than 20 billion dollars on Cerebras chips over the next three years and could receive equity warrants worth up to 10 percent of a 35 billion dollar IPO in return, according to Reuters. The deal, confirmed by Reuters and first reported by The Information, is notable on its face: a massive procurement commitment from the world's most prominent AI lab to a specialized chipmaker racing toward a public listing. But the detail that puts the story in a different category is one Reuters flagged and no other outlet has yet connected: Sam Altman, OpenAI's CEO, was an early investor in Cerebras before he ever ran OpenAI, Reuters reported in January.
That matters. Altman controls a procurement decision that could ultimately result in OpenAI holding a meaningful stake in a company he helped fund as a private investor. The deal structure (spending commitments plus equity warrants that vest as OpenAI's payments increase) blurs the line between customer and shareholder in a way that neither party has publicly acknowledged or explained. OpenAI and Cerebras declined to comment for this story.
The conflict is not hypothetical. Emails that emerged in litigation between Altman and Elon Musk show OpenAI evaluating Cerebras technology as early as 2017, when Altman was not yet OpenAI's CEO, CNBC reported. The current deal trajectory, with Cerebras targeting a Q2 2026 IPO at a 35 billion dollar valuation and a concurrent 3 billion dollar fundraise, means OpenAI could simultaneously be Cerebras's largest customer and a minority owner at listing.
The broader context makes the relationship stranger. In September 2025, NVIDIA signed a deal to sell up to 10 gigawatts of chips to OpenAI, a commitment worth roughly 100 billion dollars at current pricing, Silicon Republic reported. Months later, that deal had not been finalized, and OpenAI was already in advanced talks with Cerebras. The inference story matters here: internal OpenAI teams attributed some performance limitations in Codex, the company's code-generation product, to NVIDIA GPU hardware, TrendForce reported. OpenAI remains publicly committed to NVIDIA for training but appears to be actively diversifying its inference compute, the segment that handles real-world queries from its more than 800 million weekly users, Network World noted.
Cerebras's advantage is architectural. The company's Wafer Scale Engine-3 contains 900,000 AI-optimized cores on a single silicon wafer, with 19 times more transistors and 28 times more compute than NVIDIA's B200 Blackwell GPU, Cerebras says. CEO Andrew Feldman has said reasoning tasks that take minutes on NVIDIA GPUs take a single second on Cerebras hardware. The company calls the current engagement the largest high-speed AI inference deployment in the world.
There are reasons to be cautious about the financial figures. Reuters reported it could not independently verify the 20 billion dollar commitment or the 10 percent warrant ceiling cited by The Information. The sources who described the warrant structure did so on condition of anonymity. What is confirmed: the January 2026 deal (10 billion dollars for 750 megawatts of capacity over three years) was real, and the current deal appears to be an expansion of that commitment rather than a replacement. The 1 billion dollars OpenAI agreed to provide for data center development is also documented.
The story's significance is not that a company bought chips. It is that the buyer and the seller have a pre-existing financial relationship that the current procurement decision sits inside, and nobody has explained how that relationship was disclosed to OpenAI's board or whether it required any special governance process. NVIDIA's position is not the real story here; the conflict of interest at the center of the deal is.