The AI industry is running into a wall it cannot code its way past: there is not enough electricity.
Balaji Srinivasan, the entrepreneur and investor, put it bluntly on X this week: "It's a global energy crisis." He is not prone to hyperbole, and the numbers bear him out. At least 16 gigawatts of data center capacity is planned globally for 2026, nearly triple what was built the previous year, according to Sightline Climate. But only 5 gigawatts is currently under construction. The rest is waiting on power that does not yet exist.
The consequences are immediate and measurable. Between 30 percent and 50 percent of large data centers scheduled to come online in 2026 are expected to be delayed due to power constraints, equipment shortages, and local opposition, Sightline Climate found in its latest Data Center Outlook, which tracked 777 projects larger than 50 megawatts announced since January 2024. Last year, 26 percent of planned data center projects were delayed. The grid was not built for this.
AI has rapidly reshaped the load profile of data centers. According to the International Energy Agency, AI has been responsible for roughly 5 to 15 percent of data center power use in recent years, a share the agency projects could reach 35 to 50 percent by 2030. That shift is landing at precisely the moment utilities are least able to absorb it. Data centers were set to account for almost half of the growth in U.S. electricity consumption between 2025 and 2030, according to the IEA. The IEA estimates global data center electricity consumption will reach 1,000 terawatt-hours by 2026, roughly equivalent to Japan's total annual electricity consumption. If it reaches the higher estimate of 1,050 terawatt-hours, the sector would rank fifth globally in electricity consumption, sitting between Japan and Russia.
Goldman Sachs forecasts AI will drive a 175 percent rise in global data center power demand by 2030 compared to 2023 levels, a projection that factors into a broader macroeconomic reckoning: the investment bank estimates data center power consumption will add 0.1 percentage points to core inflation in both 2026 and 2027. AI capital expenditures accounted for 39 percent of U.S. GDP growth in the first three quarters of last year, Goldman noted, compared to 28 percent during the dot-com boom.
The constraint is not abstract. The interconnection queue for new power connections in the United States stretches years long in many regions. Transmission lines take a decade to permit and build. Natural gas peaker plants that were scheduled for retirement are being kept online. The physical infrastructure of the grid was designed for a different economy, and AI is compressing a once-in-a-generation rewrite of that infrastructure into a timeline that no utility planning cycle can accommodate.
Big tech knows this. Google has signed demand response contracts totaling 1 gigawatt with five U.S. utilities in states spanning from Arkansas to Minnesota, including Michigan and Tennessee. In a single deal with DTE Energy in Michigan, Google structured 2.7 gigawatts of capacity: 1.6 gigawatts of solar, 400 megawatts of four-hour storage, 50 megawatts of long-duration storage, 300 megawatts of additional clean resources, and 350 megawatts of demand response. Google has also moved to acquire Intersect Power, an energy generation company, giving it direct ownership of generation assets rather than relying on the open market.
Microsoft has taken a different approach, canceling some leases and announcing smaller data centers than originally planned, while prioritizing grid stability in surrounding communities, including by restarting retired nuclear capacity. Three Mile Island's Unit 1 reactor, shut down in 2019 (its sister unit was the site of the 1979 accident), is being brought back to serve Microsoft's data center campuses.
The gap between announcement and construction is not a PR problem. It is a physical reality. You cannot will a transmission line into existence faster than the time it takes to build one. The demand is real. But the pipeline from planned to powered is years long in many cases, and the delays are compounding.
The AI industry spent the last two years obsessing over HBM memory shortages and GPU allocation. The binding constraint in 2026 is not a chip. It is an electron. And unlike HBM, you cannot route around a missing transmission line by qualifying a different supplier.