Who Pays When AI Gets Thirsty for Power?

The Grid Is Not Ready

There is a number worth sitting with for a moment: 426 terawatt-hours. That is how much electricity U.S. data centers are projected to consume by 2030—a 133% increase from the 183 TWh they consumed in 2024, which itself already represented more than 4% of all electricity used in the United States ("What we know about energy use at U.S. data centers amid the AI boom," Pew Research Center).

That projection comes from the International Energy Agency's "Energy and AI" report (published April 2025), as summarized by Pew Research Center. A few caveats are worth stating plainly: the figure reflects the IEA's base case scenario, which assumes current industry forecasts and regulatory conditions hold. It covers all U.S. data center workloads—not AI alone—and does not assume dramatic efficiency breakthroughs or aggressive on-site generation. It is a projection, not a guarantee. But the directional force is not in serious dispute.

For context, 183 TWh is roughly equivalent to the annual electricity demand of the entire nation of Pakistan. By 2030, U.S. data centers are projected to consume more than twice that amount—across all workloads, from streaming video to AI model inference to enterprise cloud storage.
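The arithmetic behind that projection is worth making explicit. A short sketch, using only the figures cited above (the implied annual rate is a derived number, not one the IEA states), shows what the jump from 183 to 426 TWh means in growth terms:

```python
# Back-of-the-envelope check on the IEA figures cited above.
base_2024_twh = 183        # U.S. data center consumption, 2024
projected_2030_twh = 426   # IEA base-case projection for 2030

total_growth = projected_2030_twh / base_2024_twh - 1
# The 2024 -> 2030 window spans six years of compounding.
implied_annual_rate = (projected_2030_twh / base_2024_twh) ** (1 / 6) - 1

print(f"total increase: {total_growth:.0%}")                # ~133%
print(f"implied annual growth: {implied_annual_rate:.1%}")  # ~15.1%
```

Sustaining roughly 15% compound growth in electricity demand for six straight years is the kind of trajectory the rest of this piece is about.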

The American electrical grid was not designed for this. And the people who built it are starting to say so, loudly.



A Market Growing Faster Than the Infrastructure That Feeds It

The global data center power market is on a clear upward trajectory, projected to grow from $35.14 billion in 2025 to $50.51 billion by 2030, a compound annual growth rate of 7.5% ("Data Center Power Market Size, Statistics, Growth Analysis"). Demand for AI computing capacity is the primary engine of that growth, and the numbers reflect it: new tenants absorbed a record 2.5 gigawatts of data center capacity in 2025, up 38% from the year prior.[1]
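The compound growth rate quoted above can be checked directly from the two endpoints; this quick sketch uses only the market-size figures cited:

```python
# CAGR check: $35.14B (2025) growing to $50.51B (2030) over five years.
start_usd_bn, end_usd_bn, years = 35.14, 50.51, 5
cagr = (end_usd_bn / start_usd_bn) ** (1 / years) - 1
print(f"CAGR: {cagr:.2%}")  # ~7.53%, consistent with the cited 7.5%
```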

And yet, for the first time since 2020, the actual construction of new data centers fell. Capacity under construction dropped from 6.35 GW at the end of 2024 to 5.99 GW by the end of 2025.[1] The culprits are familiar to anyone who has watched infrastructure policy in this country: permitting delays, zoning conflicts, and a deepening inability to procure grid connections in time.

Vacancy rates in primary data center markets have fallen to a record low of 1.4%. The industry is not slowing down because demand softened. It is slowing down because the physical world—land, power lines, regulatory process—cannot keep pace with the spreadsheets.

This has reshuffled the geographic map of data center development in striking ways. Construction in Northern Virginia, long the dominant hub, fell 29%. Meanwhile, Chicago saw a 169% increase in projects, and Atlanta now has more than 2 gigawatts under active construction. Markets with available land and accessible power have become the new prize.[1]


The Grid Under Strain

The North American Electric Reliability Corporation does not traffic in hyperbole. It is a technical standards organization that publishes careful, methodical assessments. Which is precisely why the language in its 2025 Long-Term Reliability Assessment deserves attention.

NERC projects that summer peak electricity demand could surge by 224 GW through 2035—69% more than was projected just one year prior. Winter demand could rise even further, by 245 GW. The report notes that compound annual growth rates for both summer and winter peak demand are "the highest since NERC's tracking started in 1995" ("NERC Warns Long-Term Grid Reliability Risks Mounting from Surging Demand, Lagging Resources").

Thirteen of 23 North American assessment areas now face elevated or high resource adequacy risks over the next five years. By 2030, regions including MISO, PJM, ERCOT, and parts of the Western Interconnection face the highest risk of supply shortfalls. These are not fringe markets—they are the backbone of American industrial and residential electricity.

The resource composition of the grid is shifting simultaneously, and not entirely in a reassuring direction. In a single recent year, fossil capacity dropped by 21 GW while battery, wind, and solar capacity increased by 23 GW. The net math looks acceptable until you consider that solar and wind are variable—they do not generate power on demand. A hyperscale AI data center requires 300 to 500 megawatts of continuous, uninterrupted electricity—comparable to the consumption of a mid-sized city, running around the clock, every day of the year ("Why Microsoft And Amazon Are Turning To Nuclear Power For AI").

Wind does not always blow. The servers never stop.
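To put numbers on that mismatch, here is a rough sketch using illustrative capacity factors (my own assumptions drawn from typical published ranges, not figures from this article). It shows how much nameplate capacity each source needs just to match the annual energy of a firm hyperscale load, before any hour-by-hour reliability is guaranteed:

```python
# Illustrative only: capacity factors are typical published ranges, not
# figures from this article. Ignores storage, curtailment, and losses.
load_mw = 400          # continuous hyperscale draw (midpoint of 300-500 MW)
capacity_factors = {"nuclear": 0.90, "wind": 0.35, "solar": 0.25}

for source, cf in capacity_factors.items():
    # Nameplate capacity needed to produce the same annual energy as a
    # load running at load_mw around the clock.
    nameplate_mw = load_mw / cf
    print(f"{source:>7}: ~{nameplate_mw:,.0f} MW nameplate "
          f"to match a {load_mw} MW firm load")
```

Matching annual energy is the easy part; wind and solar still cannot promise delivery in any particular hour, which is the gap that storage, firming, or nuclear must fill.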

It is worth noting that the grid is not without other tools. Grid-scale long-duration storage, enhanced transmission buildouts, demand response programs, dispatchable renewables paired with firming capacity, hydrogen-based peaking, and even natural gas peakers with carbon capture are all part of the broader solution set being explored and, in some cases, deployed. None of these, individually or in combination, has yet demonstrated the ability to fully substitute for firm baseload generation at the scale and reliability that hyperscale AI infrastructure demands—but that assessment is contested, and the technology landscape is evolving. The gap is real; whether nuclear is the only way to close it is not.


The Nuclear Pivot

Microsoft and Amazon have arrived at the same conclusion through slightly different paths: if you need firm, continuous, carbon-free power at scale, nuclear is currently the most mature technology that provides it. Both companies are betting heavily on it. But it is worth being precise about what those bets entail—and what risks they carry.

Microsoft's involvement in restarting the former Three Mile Island Unit 1 reactor—now rebranded as the Crane Clean Energy Center—is one of the clearest signals of this shift. Constellation Energy secured a $1 billion Department of Energy loan in late 2025 to accelerate the restart, with commercial operation targeted for around 2027. That timeline is subject to regulatory approval, ongoing inspections, and the financing realities of nuclear project execution, which have historically run long and over budget. Microsoft has also signed a power purchase agreement tied to a planned fusion facility from Helion Energy—a longer-horizon bet on energy density at scale that remains years, if not decades, from commercial viability.[2]

Amazon's approach emphasizes direct control. Its acquisition of the Cumulus Data Center campus from Talen Energy provides "behind-the-meter" access to electricity generated by the Susquehanna nuclear facility, allowing the company to sidestep some of the transmission bottlenecks that increasingly delay new data center development. Amazon is also investing in small modular reactors through partnerships involving Energy Northwest and X-energy—a technology that has shown promise but has yet to be deployed at commercial scale in the United States, and faces permitting and construction timelines that stretch well into the 2030s.[2]

The strategic logic is straightforward: as grid connection queues stretch into multi-year delays, proximity to a dedicated power source is not a luxury—it is a competitive advantage. Nuclear, for all its complexity, offers something wind and solar cannot guarantee without extensive storage: power on demand, regardless of weather, at the scale AI infrastructure requires. But "most mature" does not mean "without risk." Restarts face regulatory scrutiny. SMRs remain unproven at commercial scale in the U.S. Fusion is a long-term moonshot. The companies pursuing these paths are making calculated bets, not plugging in solved technology.

Energy costs already represent 35% of data center operating expenditures, up from 30% in 2020 ("Data Center Power Consumption Statistics: Market Data Report 2026"). For hyperscale facilities, the efficiency gap matters enormously. The average hyperscale data center achieves a Power Usage Effectiveness (PUE) of 1.24, compared to a global average of 1.45, a gap that translates to millions of dollars annually at scale. As one illustration of the stakes: according to industry estimates, a 0.1 reduction in PUE for a 50 MW data center can save approximately $1.15 million per year in energy costs. That figure assumes a U.S. average commercial electricity rate of roughly $0.065 per kWh; the load and utilization assumptions behind it are not fully specified, so actual savings at any given facility will vary with local electricity prices, average IT load, and cooling architecture. The directional point holds: efficiency is not a rounding error. It is a competitive weapon.
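The PUE savings formula itself is simple; the assumptions are where estimates diverge. The sketch below uses the $0.065/kWh rate cited above; the 40% average-utilization case is my own illustrative assumption, included to show how the load definition moves the result:

```python
# Annual savings from a PUE improvement:
#   savings = average IT load (kW) x delta-PUE x hours/year x $/kWh
rate_usd_per_kwh = 0.065  # rate cited in the article
hours_per_year = 8760
delta_pue = 0.1

def annual_savings(avg_it_load_kw):
    return avg_it_load_kw * delta_pue * hours_per_year * rate_usd_per_kwh

# Treating the full 50 MW as IT load running flat out:
print(f"${annual_savings(50_000):,.0f}")  # $2,847,000
# At 40% average utilization (20 MW average IT load, an illustrative
# assumption), the same formula lands near the cited ~$1.15M estimate:
print(f"${annual_savings(20_000):,.0f}")  # $1,138,800
```

The spread between those two outputs is the point: the formula is settled, the inputs are not.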

Note: The WorldMetrics and MarketsandMarkets figures cited in this article are drawn from commercial market research reports. Their projections are directionally useful but carry the usual caveats of proprietary methodology and limited independent verification.


The Political Reckoning

None of this is happening in a vacuum, and elected officials have noticed.

Electric and gas utilities requested $31 billion in rate increases from state regulators in 2024—more than double the $15 billion requested the year before—with many utilities explicitly citing data center power demand as the primary driver ("Anthropic says it will pay 100% of the grid upgrade costs tied to its AI data centers"). The question of who absorbs those costs has become a live political issue at every level of government.

Last month, President Trump wrote on Truth Social: "I never want Americans to pay higher Electricity bills because of Data Centers," adding that "big technology companies who build them must pay their own way." Big Tech companies were subsequently scheduled to meet with the administration at the White House to sign a pledge on data center power costs—though as of this writing (February 25, 2026), the details and binding nature of any such pledge remain unclear and no formal agreement has been publicly confirmed. Pennsylvania Governor Shapiro has staked out a similar position, publicly demanding that data centers "pay for your own power." Earlier this month, Illinois Governor Pritzker moved to temporarily halt data center incentives to contain soaring power costs. Virginia legislators have proposed shifting more energy costs directly onto data centers to reduce residential rates.

Local communities are shifting as well. In many markets, the tide has turned from welcoming the economic benefits of major construction projects to scrutinizing how much power and land they consume. Tensions have flared in Northern Virginia. An Oracle site in New Mexico has prompted protests over environmental impact. The social contract between data center developers and the communities that host them is being renegotiated in real time.

Some technology companies are getting ahead of it. Anthropic announced it will pay 100% of the grid upgrade costs tied to its data centers—specifically, the interconnection and infrastructure upgrade costs associated with its planned facilities in Texas and New York, absorbing costs that might otherwise be passed on to American households. This is a meaningful commitment for those specific projects; it is not a blanket industry-wide policy change. Microsoft introduced similar measures last month, committing to pay utility rates sufficient to cover the full cost of its data center electricity use. These announcements suggest the industry is beginning to understand that the old model—where infrastructure costs were quietly socialized onto ratepayers—is no longer politically viable.



What Comes Next

The federal government has simultaneously identified data center development as a national priority for AI competitiveness and national security. The tension between that imperative and the imperative to protect residential electricity bills is not easily resolved—and it will define energy and technology policy for the better part of the next decade.

The data center industry is not going to slow down. The demand is real, the investment is committed, and the geopolitical stakes of the AI race are not going away. But electricity supply has crossed a threshold. It is no longer merely an operating expense to be managed. It is a strategic bottleneck—one that will determine which companies scale, which regions develop, and ultimately, which country leads in artificial intelligence.

The grid was built for a different world. The question now is whether we can rebuild it fast enough for the one we are creating.

Footnotes

  1. Data center construction fell for the first time in years as permits and power constrain growth

  2. Why Microsoft And Amazon Are Turning To Nuclear Power For AI