For more than 35 years, Arm did one thing and did it extremely well: it licensed its chip designs to everyone who wanted them. Nvidia built on Arm IP. Apple built on Arm IP. Qualcomm, Amazon, Google, Microsoft — all licensees. Arm was the IP company, not the silicon company.
That ended today.
Arm unveiled the Arm AGI CPU, the first chip the company has ever produced itself. The announcement, published on the company's newsroom, marks a fundamental shift in Arm's business model and a significant move into the AI datacenter infrastructure market.
Meta is the lead partner and co-developer. The social media company will be the first wide-scale deployer of the Arm AGI CPU in its AI data centers, with plans for multiple generations of the processor. For Arm, that is a significant win — Meta's infrastructure scale gives the chip a proving ground at real production loads.
The partner list beyond Meta is a snapshot of who is trying to diversify away from pure Nvidia dependency: Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom. Mohamed Awad, Arm's head of cloud AI, told CNBC the chip is designed for companies that cannot afford to build their own in-house processor — a direct description of the non-hyperscaler segment of the market Arm is now targeting.
The chip itself is built on the Neoverse platform that already underpins AWS Graviton, Google Axion, Microsoft Azure Cobalt, and Nvidia Vera. Arm's reference configuration packs 272 cores per blade across 30 blades in a standard air-cooled 36kW rack, for 8,160 cores in total. A liquid-cooled Supermicro design scales to 336 Arm AGI CPUs and over 45,000 cores at 200kW. By Arm's own testing,* the chip delivers more than twice the performance per rack of current x86 systems, an efficiency advantage Arm attributes to a memory-bandwidth architecture that does not degrade under sustained parallel load the way x86 does.
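The published rack figures are internally consistent, as a quick back-of-envelope check shows. Note that the per-CPU core count below is an inference from Arm's numbers (336 CPUs yielding "over 45,000 cores"), not a figure Arm has stated:

```python
# Sanity-check Arm's published rack configurations.
CORES_PER_BLADE = 272
BLADES_PER_RACK = 30

# Air-cooled 36kW reference rack.
air_cooled_cores = CORES_PER_BLADE * BLADES_PER_RACK
print(air_cooled_cores)  # 8160, matching Arm's stated total

# Liquid-cooled Supermicro design: 336 CPUs, "over 45,000 cores".
# If each blade carries two CPUs (an assumption, not stated by Arm),
# a single Arm AGI CPU would have 136 cores.
cores_per_cpu = CORES_PER_BLADE // 2  # hypothetical two-CPU blade
liquid_cooled_cores = 336 * cores_per_cpu
print(liquid_cooled_cores)  # 45696 — consistent with "over 45,000"
```

Under that two-CPUs-per-blade assumption, the liquid-cooled total lands at 45,696 cores, which squares with Arm's "over 45,000" claim.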
Qualcomm's absence from the list is notable. The company secured what it called a "complete victory" over Arm in court last fall, winning a ruling on licensing agreement terms. No congratulatory note from Qualcomm appeared in Arm's announcement.
The strategic logic Arm laid out is straightforward: as AI agents spawn more agents and workloads grow in distributed complexity, the CPU becomes the orchestrator of the entire system — managing memory, scheduling workloads, coordinating fan-out across thousands of cores. That is a different role than the CPU held in traditional cloud infrastructure, and Arm is arguing it requires purpose-built silicon rather than adapted legacy architectures.
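The orchestrator role Arm describes — one coordinating processor dispatching work to many parallel workers and gathering the results — is the classic fan-out pattern. A minimal generic sketch of that pattern (an illustration only, not Arm's implementation; `agent_task` is a hypothetical stand-in for an AI agent's unit of work):

```python
# Generic fan-out orchestration: one coordinator schedules many
# parallel tasks and collects their results in order.
from concurrent.futures import ThreadPoolExecutor


def agent_task(task_id: int) -> int:
    # Hypothetical stand-in for an AI agent's unit of work.
    return task_id * task_id


def orchestrate(num_tasks: int, workers: int) -> list[int]:
    # The "CPU as orchestrator" role: dispatch work across a pool
    # of workers, track completion, and gather results.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(agent_task, range(num_tasks)))


results = orchestrate(num_tasks=100, workers=8)
print(len(results))  # 100
```

At datacenter scale the same shape recurs with thousands of cores instead of eight threads, which is the workload Arm argues calls for purpose-built silicon.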
Arm did not disclose financial terms of the Meta partnership or chip volumes. The first deployments are expected later this year.
At press time, Arm's announcement footnote on the performance claim read: "Arm internal estimates based on Arm Neoverse N2 vs. competitive x86-based server CPU performance, normalized per rack. Results may vary."