Terry Harvill is the managing director of Laclede Economics and August Ankum is a founding partner and chief economist at QSI Consulting.
The rapid expansion of hyperscale data centers is fundamentally transforming the electricity sector. Driven by surging demand from artificial intelligence, cloud computing and digital services, these facilities are seeking grid access at unprecedented scales and speeds. Often requiring hundreds of megawatts, these loads are not merely consuming electricity. They are reshaping infrastructure, raising reliability concerns and triggering transmission and capacity investments that ripple across the system.
Recent research reinforces this urgency. A comprehensive May 2024 study by the Brattle Group concluded that electricity demand growth today is higher than at any point in the last two decades, driven largely by the explosive rise of data centers, industrial electrification, transportation electrification and cryptocurrency mining. Brattle’s analysis warns that existing planning and forecasting methods are no longer sufficient to manage the scale, speed and policy-driven nature of these new loads.
The challenge is not whether to accommodate this growth, but how to do so without distorting electricity markets or forcing other customers to subsidize private infrastructure. In a system already strained by resource constraints and planning delays, adding large, concentrated loads without clear price signals risks driving costly misallocations and undermining the very markets that support long-term reliability and investment.
Recent projections underscore the scale and immediacy of the challenge. PJM, for example, expects summer peak load to grow by more than 70,000 MW over the next 15 years, a shift largely driven by data center growth in areas like eastern Pennsylvania and Northern Virginia. Load growth on this scale is no longer theoretical. It is happening now, and electricity markets must adapt, starting with a more disciplined and transparent approach to how grid access is priced and allocated.
Institutional responses have been inadequate
Historically, load interconnection was handled discreetly between large customers and their local utilities. But the size and concentration of today’s hyperscale loads, often co-located in clusters and dependent on transmission-scale delivery, demand broader scrutiny. Regional transmission organizations and regulators are being pulled into these discussions, especially when grid reliability and cost allocation are at stake.
In one prominent case, the Federal Energy Regulatory Commission recently rejected an amended interconnection service agreement involving PJM, Susquehanna Nuclear and a co-located Amazon data center. The amendment sought to increase load at the nuclear facility by nearly 200 MW. FERC’s decision reflected concern that PJM had not adequately justified deviations from its standard agreement or addressed broader implications for system reliability and fairness. The ruling highlights the growing complexity of integrating large, transmission-scale loads and the inability of current frameworks to handle them.
These challenges are real. But the response so far has leaned on administrative discretion and a case-by-case patchwork. This is exactly the kind of market distortion that Harvard professor William Hogan has warned against for decades: scarce resources allocated by rules and negotiations rather than by prices.
Grid capacity is not free
The transmission grid is finite and valuable. It is limited by geography, timing and technical constraints. If a 500-MW data center requests interconnection in a congested zone, it does not merely tap into excess supply. It displaces other users and drives the need for new infrastructure. That demand imposes a real opportunity cost.
When we do not price that cost, we distort behavior. Data centers may choose locations based on tax breaks, land prices, or fiber access rather than grid capacity. The result is misallocation. Loads concentrate in already-stressed areas. Upgrade costs rise. And residential or small commercial customers are forced to subsidize private industrial development through higher shared transmission rates.
This is economically indefensible. It violates the principle of cost causation that underpins utility regulation and market design. More importantly, it dulls the investment signals that should guide where and how the system expands. If we want grid investments to flow where they are needed most, we need price signals that reflect scarcity.
Pricing the scarcity
The most efficient solution is one based on marginal cost, pricing access in line with the actual impact each project has on the grid. In areas of the grid where there is sufficient headroom and no risk of congestion, access can and should be inexpensive or free. When grid upgrades are required to accommodate a new load, the costs of those upgrades should be paid by the entity that caused them. And in areas where capacity is scarce and highly valued, access should be allocated through market-based mechanisms that reflect what users are willing to pay.
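To make that rule concrete, here is a minimal Python sketch of a three-tier access charge. Everything in it is an illustrative assumption rather than an actual tariff: the upgrade cost would come from an interconnection study, and the scarcity price from a market mechanism of the kind discussed next.

```python
from dataclasses import dataclass

@dataclass
class InterconnectionRequest:
    load_mw: float       # requested load
    upgrade_cost: float  # cost of network upgrades the request triggers ($)

def access_charge(req: InterconnectionRequest,
                  headroom_mw: float,
                  scarcity_price: float) -> float:
    """Illustrative three-tier charge: free where headroom is ample,
    causer-pays for triggered upgrades, market-priced where capacity is scarce."""
    charge = req.upgrade_cost                        # cost causation: causer pays
    shortfall = max(0.0, req.load_mw - headroom_mw)  # MW beyond available headroom
    charge += shortfall * scarcity_price             # scarcity rent on constrained MW
    return charge

# A 500-MW request in a zone with 200 MW of headroom, $40 million of
# triggered upgrades and a $100,000/MW scarcity price (all hypothetical):
print(access_charge(InterconnectionRequest(500, 40e6), 200, 1e5))  # 70000000.0
```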
This pricing structure could take many forms: auction-based access rights, tradable interconnection positions, location-specific pricing, or forward capacity markets for transmission. The form may vary by region, but the function must be the same: to reveal scarcity, reward efficient siting, and ensure those driving new demand also fund the infrastructure needed to serve it.
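As one illustration of the auction-based variant, the sketch below clears a uniform-price auction for scarce interconnection capacity. The bidders, quantities and prices are hypothetical, and a real design would add bid rules, credit requirements and delivery terms.

```python
def clear_access_auction(capacity_mw, bids):
    """Uniform-price auction: highest-value bids win until capacity runs out,
    and every winner pays the price of the marginal (last accepted) bid.

    bids: list of (bidder, mw, price_per_mw) tuples.
    """
    awards, remaining, clearing_price = [], capacity_mw, 0.0
    for bidder, mw, price in sorted(bids, key=lambda b: b[2], reverse=True):
        if remaining <= 0:
            break
        take = min(mw, remaining)
        awards.append((bidder, take))
        clearing_price = price  # marginal accepted bid sets the uniform price
        remaining -= take
    return awards, clearing_price

# 400 MW of headroom, three hypothetical bidders:
awards, price = clear_access_auction(
    400, [("DC-A", 300, 95.0), ("DC-B", 250, 70.0), ("DC-C", 100, 40.0)])
# DC-A wins 300 MW, DC-B wins 100 MW, and both pay $70/MW.
```

The design choice worth noting is the uniform clearing price: every winner pays what the marginal bid reveals scarce capacity to be worth, which both exposes the value of constrained locations and rewards efficient siting.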
This is not about blocking data centers. It is about enabling them to grow efficiently, without placing the financial burden of their development on the broader public. Hyperscale loads are welcome, but they should pay the full cost of the capacity they consume. If they choose to locate in constrained areas, they should cover the incremental infrastructure costs. If they want lower-cost access, they should consider siting in areas where transmission capacity already exists or investing in on-site or behind-the-meter resources to reduce their impact.
Some argue these costs should be socialized in the name of job creation, local tax revenues, or innovation. But every infrastructure investment has both benefits and costs. The electricity system should not function as a hidden subsidy mechanism for large, well-capitalized private developers. If costs are socialized while profits remain private, the result is an unjust and economically inefficient allocation of resources.
Evidence of this cost-shifting risk is mounting. Harvard Law School’s Electricity Law Initiative has raised concerns about utilities passing infrastructure costs related to data centers onto general ratepayers. In states like Texas and Ohio, legislators and regulators are considering policies to prevent large loads from shifting the costs of grid upgrades onto other customers. The fact that such measures are necessary underscores the concern: without clear rules and transparent pricing, the costs of accommodating hyperscale load risk being unfairly spread across everyone else.
Avoiding the trap of administrative discretion
Today’s interconnection processes were not designed for this scale of load growth. They were built for generators and rely on queuing logic and engineering studies to manage access. These processes do not reflect the economic value of grid capacity. As a result, access to the system is rationed by queue position, not by the value a project creates or its willingness to fund upgrades.
This creates hidden inefficiencies. Higher-value projects can be stuck behind lower-value ones. Load that would be willing to pay for access is sidelined by procedural rules. And political discretion creeps into the process as policymakers and utilities decide who gets expedited treatment and who waits. These are not market outcomes. They are administrative workarounds to a broken allocation model.
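A toy comparison makes the cost of that model visible. In the hypothetical Python sketch below, a first-come, first-served queue hands scarce capacity to an early, low-value project, while allocating the same capacity by willingness to pay serves the higher-value load first; all names and figures are invented.

```python
def allocate(capacity_mw, projects):
    """Fill requests in the order given until capacity is exhausted."""
    awards, remaining = [], capacity_mw
    for name, mw, value in projects:
        take = min(mw, remaining)
        if take > 0:
            awards.append((name, take, value))
            remaining -= take
    return awards

def total_value(awards):
    return sum(mw * value for _, mw, value in awards)

queue = [("early-low-value", 400, 20.0),   # arrived first
         ("late-high-value", 400, 90.0)]   # arrived later, worth more per MW

fifo   = allocate(500, queue)                                            # queue order
priced = allocate(500, sorted(queue, key=lambda p: p[2], reverse=True))  # by value
print(total_value(fifo), total_value(priced))  # 17000.0 38000.0
```

Under the queue, more than half the achievable value is forgone, and nothing in the process even records that it happened.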
The path forward
Grid access must be treated like energy and capacity: as a scarce, priced product. That means building markets for interconnection capacity, moving away from socialized costs and reinforcing the principle that those who impose system costs must be the ones to bear them.
The foundational market structures already exist. Energy is priced at the margin. Congestion is hedged through financial transmission rights. Capacity is auctioned. What remains is to extend that same logic to the physical act of connecting new demand to the system.
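For readers outside the industry, “priced at the margin” means the last unit needed to meet demand sets the price for every unit sold. A minimal sketch, with a made-up offer stack:

```python
def marginal_price(demand_mw, offers):
    """Merit-order clearing: dispatch cheapest offers first; the price of the
    last unit needed to meet demand becomes the market price for everyone."""
    for price, mw in sorted(offers):  # offers as (price $/MWh, quantity MW)
        demand_mw -= mw
        if demand_mw <= 0:
            return price
    raise ValueError("insufficient supply to meet demand")

# Hypothetical offer stack: 1,200 MW of demand is met partway into the
# $60/MWh block, so $60/MWh clears the market.
print(marginal_price(1200, [(20, 500), (35, 400), (60, 400), (120, 200)]))  # 60
```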
Hyperscale data center growth will not slow down. Nor should it. But if data centers are to grow at scale, they must operate within a framework that respects the economic and physical limits of the grid. The electricity system does not need new barriers. It needs clear, fair and economically disciplined rules.
Markets work. But only when prices reflect scarcity. If we ignore that truth in the name of expediency or growth, we will misallocate billions in infrastructure investment, erode public trust and compromise the long-term reliability and fairness of the system.
Getting it right is not just about powering data centers. It is about preserving the integrity of electricity markets and ensuring affordable, reliable power for everyone.