The Inconvenient Physics of Artificial Intelligence
Training a single large language model like GPT-4 consumed an estimated 50 gigawatt-hours of electricity — roughly equivalent to the annual energy consumption of 4,600 American homes. That figure was from 2023. The models deployed in 2025 and 2026 are substantially larger, trained on substantially more data, with substantially higher compute requirements. And training is only part of the equation: inference — the process of actually running a deployed model to answer a query — runs continuously, at scale, across millions of requests per hour.
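The homes equivalence is easy to sanity-check. A minimal back-of-envelope sketch, assuming the EIA's average US household consumption of roughly 10.8 MWh per year (an assumed figure, not from the text above):

```python
# Back-of-envelope check of the "4,600 homes" equivalence.
# HOME_MWH_PER_YEAR is an assumed approximate US average, not a cited figure.
TRAINING_GWH = 50          # estimated GPT-4 training energy
HOME_MWH_PER_YEAR = 10.8   # approximate average US household consumption

homes = TRAINING_GWH * 1000 / HOME_MWH_PER_YEAR
print(f"{homes:.0f} homes")  # ≈ 4,630 — consistent with the ~4,600 cited
```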
Goldman Sachs estimated in 2024 that AI data centres could account for 8% of US electricity demand by 2030, up from roughly 3% in 2023. The International Energy Agency projected that data centres globally would consume around 1,000 terawatt-hours of electricity in 2026 — comparable to Japan's entire national electricity consumption. The numbers keep moving upward because the computational demands of frontier AI are growing faster than the industry's ability to improve efficiency.
This creates a physical problem that no amount of software optimisation entirely resolves. Somewhere, electrons must flow through wires to power the chips. Those electrons must come from somewhere. And the "somewhere" is increasingly inadequate.
The Grid Cannot Keep Up
America's electricity grid was built for a world of steady, predictable demand — factories, air conditioning, lighting, refrigeration. Adding data centres that draw hundreds of megawatts continuously, 24 hours a day, 365 days a year, stresses a system designed for a different era. Unlike most industrial loads, a data centre cannot reduce consumption when the grid is strained. The servers must run. The cooling systems must run. The network equipment must run. Any interruption is measured in millions of dollars of lost revenue and violated service-level agreements.
The practical consequence is that the largest cloud providers — Amazon Web Services, Microsoft Azure, and Google Cloud — are finding it increasingly difficult to build new hyperscale data centres in the United States because local utilities cannot guarantee sufficient power supply on the timelines the companies require. PJM Interconnection, which operates the grid across 13 states and the District of Columbia, spanning much of the Mid-Atlantic and Midwest, reported in 2024 that its interconnection queue — the backlog of projects seeking to connect new generation capacity to the grid — had grown to over 3,000 projects representing more than 300 gigawatts of requested capacity. The wait time for a new connection had stretched to 5–7 years.
Five to seven years is not a viable timeline for a technology sector operating on 18-month product cycles.
| Company | Deal | Capacity / Value | Year |
|---|---|---|---|
| Microsoft | 20-year PPA — restart of Three Mile Island Unit 1 (Constellation Energy) | 835 MW | 2024 |
| Amazon (AWS) | Acquisition of Talen Energy's nuclear-powered data centre campus (Pennsylvania) | $650M / 960 MW campus | 2024 |
| Google | Agreement with Kairos Power for small modular reactors | ~500 MW (6–7 SMRs) | 2024 |
| Microsoft | PPA with Helion Energy (fusion, conditional on commercial operation) | 50 MW | 2023 |
| Amazon | Investment in X-energy to support SMR deployment for AWS | $500M round | 2024 |
Three Mile Island: The Symbol and the Reality
In September 2024, Microsoft announced a 20-year power purchase agreement with Constellation Energy to restart Unit 1 of the Three Mile Island nuclear plant in Pennsylvania — the reactor that was not involved in the 1979 accident, but that had been shut down in 2019 when cheap natural gas made its economics unviable. Constellation is targeting a restart before the end of the decade, with the unit's 835 megawatts of zero-carbon baseload power contracted entirely to Microsoft.
The symbolism was intentional and overwhelming. Three Mile Island is the most recognisable name in American nuclear history, synonymous in public consciousness with the near-catastrophe of 1979. Restarting it — not because the government mandated it, not because environmentalists demanded it, but because a corporation needed reliable zero-carbon power — represented something genuinely new in the politics of energy: market demand pulling nuclear back from the graveyard that regulatory costs and cheap gas had dug for it.
The economics were similarly clarifying. A 20-year power purchase agreement at an undisclosed but presumably above-market rate made the economics of restarting a paid-off nuclear plant viable. Without a committed, creditworthy buyer for every megawatt-hour the plant could produce, Constellation could not justify the restart costs. With Microsoft's contract, the restart was financially straightforward. This is the template: tech company as anchor tenant, nuclear plant as dedicated power source, grid as backup rather than primary supply.
Why Renewables Cannot Solve This
The obvious question is why Big Tech's energy problem cannot be solved by building more solar and wind, consistent with the carbon commitments these companies have publicly made. The answer is physics and economics, not politics.
Solar panels generate electricity when the sun shines. Wind turbines generate electricity when the wind blows. A hyperscale data centre requires electricity continuously, at a predictable voltage and frequency, regardless of weather conditions. The solution to renewable intermittency is storage — batteries that absorb excess generation and release it when the sun isn't shining or the wind isn't blowing. Current battery storage technology can bridge gaps of four to eight hours at commercially viable costs. It cannot bridge a cloudy, windless week.
The data centre constraint makes this worse, not better. A Google data centre in Virginia draws approximately 100 megawatts continuously. To backstop that load through a 72-hour low-generation period using lithium-ion battery storage would require approximately 7.2 gigawatt-hours of installed storage capacity. At current battery costs of roughly $300–400 per kilowatt-hour installed, that is $2.2–2.9 billion in storage alone — for a single data centre's backup for three days.
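The arithmetic behind those figures is straightforward. A minimal sketch using the article's own assumed inputs (load, outage duration, installed cost range):

```python
# Rough cost of bridging a 72-hour low-generation period with lithium-ion
# storage. All inputs are the article's assumed figures, not vendor quotes.
LOAD_MW = 100              # continuous data-centre draw
OUTAGE_HOURS = 72          # cloudy, windless stretch
COST_PER_KWH = (300, 400)  # installed cost range, $/kWh

energy_gwh = LOAD_MW * OUTAGE_HOURS / 1000   # 7.2 GWh of storage needed
energy_kwh = energy_gwh * 1_000_000
low, high = (energy_kwh * c / 1e9 for c in COST_PER_KWH)
print(f"{energy_gwh} GWh -> ${low:.1f}B to ${high:.1f}B")  # $2.2B to $2.9B
```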
Nuclear solves the problem that renewables cannot: it produces power continuously, at high density, on a small land footprint, at a predictable cost, in any weather. A single 1,000-megawatt nuclear plant occupies roughly one square mile. To produce the equivalent generation from solar would require approximately 75 square miles of panels. For a company that needs to build power generation near its data centres, often in suburban or semi-urban environments, nuclear's land efficiency is as important as its reliability.
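The land comparison follows from capacity factors and solar land intensity. A sketch under assumed values (nuclear capacity factor ~0.92, utility solar ~0.24, ~12 acres of total project footprint per MW of solar nameplate — all assumptions, not figures from the text), which lands in the same neighbourhood as the ~75 square miles cited:

```python
# How much solar land replaces a 1,000 MW nuclear plant's annual output?
# Capacity factors and acres-per-MW below are assumed round numbers.
NUCLEAR_MW, NUCLEAR_CF = 1000, 0.92
SOLAR_CF, ACRES_PER_SOLAR_MW = 0.24, 12
ACRES_PER_SQ_MILE = 640

# Solar nameplate needed to match the nuclear plant's energy output.
solar_mw = NUCLEAR_MW * NUCLEAR_CF / SOLAR_CF
sq_miles = solar_mw * ACRES_PER_SOLAR_MW / ACRES_PER_SQ_MILE
print(f"{solar_mw:.0f} MW of solar over ~{sq_miles:.0f} square miles")
```

Different land-intensity assumptions shift the result between roughly 45 and 75 square miles, but the order of magnitude — tens of square miles versus one — is robust.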
The Small Modular Reactor Bet
The more speculative part of Big Tech's nuclear strategy involves small modular reactors — SMRs — which are nuclear plants designed to be factory-built and deployed at a fraction of the cost and construction time of conventional nuclear. The theory is appealing: instead of a $20 billion, decade-long mega-project to build a large reactor, you order a 50–300 megawatt SMR from a factory, ship it to your site, and install it like industrial equipment.
Google's deal with Kairos Power — signed in October 2024 — was the first corporate power purchase agreement for SMRs. The agreement commits Google to purchasing electricity from six or seven Kairos reactors beginning in 2030, totalling approximately 500 megawatts. Kairos's reactor design uses molten fluoride salt as a coolant rather than pressurised water, operating at lower pressures and higher temperatures, with passive safety characteristics intended to rule out both the runaway reactivity excursion that destroyed Chernobyl and the loss-of-coolant meltdown that damaged Fukushima.
The sceptical case is straightforward: SMRs have been "five to ten years away" from commercial deployment for roughly 25 years. The regulatory hurdles in the United States, United Kingdom, and Europe remain substantial. NuScale Power, the first company to receive NRC design approval for an SMR, cancelled its flagship project in 2023 after projected costs rose by roughly 75% during development. The technology works in principle and has worked in naval applications for decades. Whether it can be deployed commercially at scale, on schedule, and at projected cost is an open empirical question.
The Geopolitical Dimension
The energy stakes are not purely domestic. The country that successfully commercialises nuclear power for AI infrastructure — whether through large-scale plant restarts, SMR deployment, or eventually fusion — will have a structural advantage in the global AI race. AI computation requires energy. Energy supply determines the scale at which AI can be trained and deployed. A country with abundant, reliable, affordable electricity can run more computation per dollar than a country reliant on expensive, intermittent, or constrained power supply.
China understands this. The country currently has 26 nuclear reactors under construction — more than the rest of the world combined. China's national energy strategy explicitly frames nuclear as infrastructure for AI dominance, not just carbon reduction. The State Grid Corporation of China has been directed to prioritise power supply to AI data centres as a matter of national strategic priority.
The United States, by contrast, has not completed a new nuclear plant since 2024, when Vogtle Unit 4 in Georgia entered commercial operation roughly seven years late, with the two-unit expansion about $17 billion over budget. The regulatory environment for nuclear in the US remains adversarial, shaped by a post-1979 political culture that treats every proposed nuclear project with presumptive hostility. Big Tech's nuclear deals are happening despite this regulatory environment, not because of a policy shift that has made it easier. The permitting reform that would allow America to build nuclear at Chinese timelines and costs does not yet exist.
This connects directly to what the agentic AI shift means for competitive advantage: the companies and nations that control the energy infrastructure for AI will have a structural advantage that compounds over time. Energy is the constraint that turns AI capability into economic and geopolitical power. The nuclear gamble is not about electricity. It is about who gets to run the next decade of civilisation-shaping computation.
What Gets Disrupted
The secondary effects of Big Tech's nuclear push extend beyond energy markets. The electricity utilities that have historically served as regulated monopolies in their regions are finding their largest potential customers building captive generation capacity rather than buying from the grid. This hollows out the revenue base that subsidises the grid infrastructure on which everyone else depends.
The natural gas industry — which expanded dramatically over the past two decades to supply electricity generation — is watching its most creditworthy customers pivot to nuclear. Goldman Sachs estimates that every gigawatt of new nuclear capacity commissioned for AI data centres displaces approximately 700 megawatts of natural gas peaking capacity. At scale, this is a material threat to the business model of LNG export terminals and pipeline companies that built capacity in anticipation of continued data centre demand growth.
For ordinary electricity customers — households and businesses without the purchasing power to buy their own reactor — the implications are mixed. Large-scale nuclear investment increases total supply, which in a functional electricity market would reduce prices. But if the new nuclear capacity is contracted entirely to Big Tech, bypassing the public grid, the supply addition does not benefit public customers. The grid gets the capital cost of transmission and distribution infrastructure without the revenue of a large industrial customer. These costs are socialised. The benefits are privatised. The pattern is familiar.
The Stand
The nuclear revival being driven by AI energy demand is one of the few genuinely positive structural developments in Western energy policy in two decades — not because of enlightened planning, but despite its absence. A market signal that was 70 years in the making — "reliable baseload zero-carbon power has value proportionate to its reliability" — is finally strong enough to pull private capital into nuclear at scale.
The irony is that the environmental movement spent 40 years killing nuclear on safety grounds, inadvertently prolonging a dependence on fossil fuels whose air pollution has killed millions more people than nuclear ever has. Now the corporations building the AI infrastructure that same movement would likely critique for its data exploitation and energy consumption are the ones reviving the technology that could have decarbonised electricity grids two generations ago.
Markets are not moral. They are directional. In energy, the direction they are pointing is toward nuclear, driven by a demand signal that no amount of ideological opposition can neutralise. Whether Western governments adapt their regulatory frameworks fast enough to capture that signal, or watch it flow to jurisdictions with fewer constraints, is the energy policy question of the next decade.