When it takes seven years to connect a data center to the grid, companies stop waiting for the grid.
That is the situation AI infrastructure has arrived at, according to new data from satellite imagery analysts and energy researchers. And the workaround that companies have chosen tells you more about the state of the AI buildout than any earnings call.
The delay numbers are real. Nearly 40 percent of US data center projects planned for 2026 are at risk of delay or cancellation, according to an analysis by SynMax shared with the Financial Times. Major projects tied to Microsoft, Oracle, and OpenAI are likely to miss their completion dates by more than three months, with some slipping into 2027 or beyond, per Ars Technica citing the FT reporting.

The project-level detail is starker. Oracle's 1.4-gigawatt campus in Shackelford County, Texas, was supposed to deliver its first building in the second half of 2026. Satellite imagery shows land cleared for six of ten planned buildings, with only one showing actual construction activity as of early April, per Expansion citing FT reporting. The earliest realistic delivery date is December 2026; the median comparable project runs into late 2027. OpenAI's 1.2-gigawatt facility in Milam County, Texas, announced by Greg Brockman as recently as last month, has broken ground on exactly one of its planned installations. Of OpenAI's major Texas projects, only the Abilene facility is expected online this year.
The causes are compounding: a shortage of specialized tradespeople, grid connection queues stretching to seven years, permit backlogs, tariffs on imported transformers and switchgear. OpenAI is effectively competing with OpenAI, noted Doug O'Laughlin of SemiAnalysis. Workers move between projects for higher wages. For those who stay, even double shifts may not be enough to meet the timelines the companies announced.
But the more revealing story is what companies are doing about the power problem. Rather than wait for grid capacity, AI developers are building their own generation. Cleanview, a market intelligence firm that tracks data center construction, identified 46 data centers with a combined capacity of 56 gigawatts that plan to build their own behind-the-meter power. That represents roughly 30 percent of all planned US data center capacity. Ninety percent of those projects, representing about 50 gigawatts, were announced in 2025 alone.
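Those percentages imply a much larger total buildout than the behind-the-meter figure alone. A back-of-envelope check of the numbers above (the implied total is derived from the article's percentages, not a figure Cleanview published directly):

```python
# Sanity check of the reported figures. Inputs are the article's numbers;
# the implied total planned capacity is derived, not directly reported.

behind_meter_gw = 56      # 46 data centers planning their own generation
share_of_planned = 0.30   # ~30% of all planned US data center capacity

# If 56 GW is ~30% of the planned total, the total is roughly:
implied_total_gw = behind_meter_gw / share_of_planned
print(f"Implied total planned capacity: ~{implied_total_gw:.0f} GW")  # ~187 GW

# 90% of the behind-the-meter capacity was announced in 2025
announced_2025_gw = 0.90 * behind_meter_gw
print(f"Announced in 2025: ~{announced_2025_gw:.0f} GW")  # ~50 GW, matching the report
```

The ~50-gigawatt figure for 2025 announcements falls straight out of the reported percentages, which suggests the numbers are internally consistent.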
The equipment choices reveal the desperation. About 75 percent of the generation equipment Cleanview could identify, roughly 23 gigawatts' worth, runs on natural gas. But the specific technologies being deployed are not those of a planned energy transition. Data center developers are buying mobile gas generators strapped to semi-trucks, aeroderivative turbines originally designed for aircraft and warships, and reciprocating engines typically used in industrial operations. One developer, unable to source conventional turbines, placed a $1.25 billion order with Boom Supersonic, a company that has never sold a power generation product. xAI famously drove diesel-burning generator trucks directly onto its Memphis site to get power online without waiting for utility infrastructure.
These are not efficient choices. Mobile generators and aeroderivative turbines carry significant cost premiums per kilowatt-hour compared to utility-scale gas combined-cycle plants. But efficiency is not the operative variable. An AI data center can earn $10 to $12 billion per gigawatt annually, per Cleanview. Getting a facility online a few years early can mean the difference between capturing that revenue and missing a market window. Speed-to-power justifies premiums that would be irrational in any other context.
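The scale of that trade-off is easy to see with rough numbers. A sketch, using Cleanview's revenue figure from the article; the per-kilowatt-hour generation premium below is a hypothetical placeholder for illustration, not a reported price:

```python
# Rough illustration of why speed-to-power dominates generation cost.
# The revenue figure is from Cleanview per the article; the cost premium
# is a hypothetical assumption, not a sourced number.

hours_per_year = 8760
revenue_per_gw_year = 10e9   # low end of the $10-12B per GW per year range

# Implied revenue per kWh if a 1 GW facility ran flat out all year
kwh_per_gw_year = 1e6 * hours_per_year   # 1 GW = 1e6 kW
revenue_per_kwh = revenue_per_gw_year / kwh_per_gw_year
print(f"Implied revenue: ${revenue_per_kwh:.2f}/kWh")   # ~$1.14/kWh

# Hypothetical premium for mobile/aeroderivative generation over
# utility-scale combined cycle (illustrative only)
premium_per_kwh = 0.08
print(f"Premium as share of revenue: {premium_per_kwh / revenue_per_kwh:.1%}")
```

Even under a generous assumed premium, expensive generation consumes only a small single-digit share of the implied revenue per kilowatt-hour, which is why the premium reads as rational to operators.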
The 23 gigawatts of gas-fired behind-the-meter capacity also exists largely outside standard national energy forecasts. These are distributed installations, many below the threshold for mandatory emissions reporting, and most have received little public scrutiny. If this capacity comes online as projected, it represents a significant unplanned fossil fuel buildout that will not appear cleanly in any emissions accounting.
Geographic concentration is making the bottlenecks worse. In Texas alone, multiple major campuses are competing simultaneously for the same specialized trades and grid infrastructure. Regulatory resistance is building too: Maine just became the first state to pass an 18-month moratorium on new data centers exceeding 20 megawatts of power demand, per Ars Technica, and communities from Virginia to New Jersey are pushing back against proposed facilities, citing grid stability and rate impacts.
The practical implication is straightforward: AI companies have decided the grid is not a viable path forward for their power needs at the required scale and timeline. They are building their own fossil-fuel-based power infrastructure, sourcing equipment from companies with no prior power generation track record, and treating the resulting cost premiums as a cost of staying in the race. The seven-year grid queue did not slow down AI ambitions. It redirected them.
The underlying economics may be sound in the narrow sense. At $10 to $12 billion in annual revenue per gigawatt, paying a premium for private generation is rational so long as AI compute stays valuable enough to sustain those returns. But the workaround reveals something about constraints that capital cannot simply buy away. Transformers, turbines, and specialized labor are physical resources with real supply chains. The AI industry has discovered that money does not make them appear faster.
The longer-term question of who controls the power infrastructure that AI runs on may matter more than whatever the data centers are computing.