Google holds more AI computing power than any other company on earth, and it got there in a way nobody else has managed: by building its own chips.
New data from Epoch AI, published in Q4 2025, shows that Google operates the equivalent of roughly 5 million Nvidia H100 GPUs in compute capacity — about 25 percent of the total held by the five major U.S. hyperscalers. That makes it the largest single owner of AI computing power in the world. The more interesting number is the one that explains why: roughly 75 percent of Google's stockpile runs on its own custom silicon, the Tensor Processing Units it has designed in-house since 2016. Every other hyperscaler — Microsoft, Amazon, Meta — relies predominantly on Nvidia chips.
The Epoch AI dataset tracks AI compute ownership by attributing chips to the entity that owns them, not the entity that uses them. That distinction matters. OpenAI's training runs happen on Microsoft Azure. Anthropic's models run on Amazon Web Services and on Google's TPU clusters. Under Epoch's methodology, those chips count as Microsoft, Amazon, and Google compute — not as OpenAI or Anthropic compute. The labs are tenants. The hyperscalers own the infrastructure.
For Google, that infrastructure is unusually self-sufficient. Custom ASICs like TPUs give Google a different kind of industrial control over its compute fleet. Google owns the TPU instruction set architecture, the microarchitecture, and the software stack that programs the chips. Broadcom partners on the physical design and the path to manufacturing, but Google retains the intellectual property that defines what the hardware is and what it can do. No other hyperscaler has that relationship with its primary compute supplier — their tie to Nvidia is a customer-vendor link, not an ownership one.
The Anthropic arrangement sharpens the tension. Anthropic, the AI safety company behind the Claude model family, runs a substantial portion of its workloads on Google TPU clusters. Google is simultaneously the largest AI compute owner in the world and a critical infrastructure provider to one of the most prominent AI labs — a company that competes with Google's own Gemini models. If Anthropic's models become more capable or more trusted than Google's, the business logic of the arrangement becomes complicated. Google owns the chips that power its competitor's rise.
This is not a secret arrangement. Both companies have disclosed it. But the Epoch data makes the scale of it legible in a way that a disclosure paragraph buried in a filing does not. When a company controls a quarter of the world's AI compute and also hosts the models that compete with it, the question of what "neutral infrastructure provider" actually means is not abstract. It is a product strategy with a direct conflict baked in.
The five U.S. hyperscalers — Google, Microsoft, Amazon, Meta, and Apple — collectively hold over 60 percent of the world's AI compute, according to Epoch's estimates. That concentration has drawn increasing attention from policymakers concerned about supply chain fragility. The CHIPS Act was partly a response to this kind of concentration. A world where the most important AI compute is owned by a handful of American companies, and where the most advanced AI labs depend on renting access to that compute rather than owning it, is a world with a specific set of power dynamics that the current regulatory conversation has only begun to map.
Google's reliance on its own TPUs creates a partial exception to those concerns. A company that designs its own chips is not exposed to supplier risk the way one that procures from a single outside vendor is: Nvidia cannot cut off Google's access to H100s, because Google does not need H100s. But Google also does not sell TPUs as merchant silicon — access comes only through its cloud. Google's silicon independence is a moat; it is also a ceiling on how much of the world's AI infrastructure it can credibly claim to democratize.
For founders and engineers watching the infrastructure layer, the Epoch data is a reminder that the AI compute market is not a commodity market. The chips are not interchangeable. Who owns them, how they were built, and what relationships tie the owners to the labs that depend on them are questions that do not show up in benchmark scores but show up everywhere in the negotiating dynamics that actually shape the industry.
What to watch next: the next round of Epoch data, expected in mid-2026, will show whether Google's TPU fleet continues to scale faster than its hyperscaler peers, and whether the Anthropic arrangement expands or contracts as both companies push toward more capable models. The structural question — whether the largest AI compute owner can also be a credible neutral platform — will not resolve itself. It will be decided by product choices, regulatory pressure, and the pace at which the industry's dependency on rented infrastructure becomes untenable for the labs that rely on it.
The Epoch AI data hub is at epoch.ai/data/ai-chip-owners. The methodology is explained at epoch.ai/blog/introducing-the-ai-chip-owners-explorer.