The rapid expansion of artificial intelligence (AI) applications is fueling an unprecedented wave of data center development across the United States. In Northern Virginia, dubbed “Data Center Alley,” transmission-connected facilities have been reshaping the energy landscape for years. As AI adoption accelerates, newer hotspots such as Phoenix, Arizona, and Dallas, Texas, have emerged as fast-growing markets, and the trend shows no signs of slowing as other regions see similar growth. Meanwhile, the Department of Energy is focused on accelerating site selection for federally supported AI infrastructure, leaving utilities to face a dual challenge: managing explosive load growth and adapting to the rise of behind-the-meter generation.
As AI-driven data centers scale rapidly, the traditional utility model is under pressure. These facilities demand immense, fluctuating power—often faster than utilities can build infrastructure to support them. To meet power quality and reliability targets and to avoid grid bottlenecks, developers are increasingly turning to on-site generation. But this shift raises deeper questions about long-term integration, grid coordination and regulatory oversight. Across the U.S., some utilities and policy experts are calling for greater transparency, formal registration and collaborative planning to ensure that large loads like data centers can connect safely and sustainably to the grid.
Why Data Centers Are Turning to On-Site Generation
AI-powered applications demand immense computational resources, translating into unpredictable and rapidly fluctuating power consumption. This “load jitter,” where power draw can spike or dip rapidly, makes it difficult for utilities to forecast and allocate capacity. Compounding the challenge is a mismatch in construction timelines: data centers can be built far faster than the five to seven years utilities typically require to expand transmission infrastructure. As a result, data center developers are finding they can deploy on-site generation in just two to three years, bypassing traditional grid constraints.
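For planners, the practical question is how large and how fast those swings are. The short sketch below is a hypothetical illustration of how load jitter might be quantified from interval meter data; the one-minute load profile is invented for illustration, not drawn from any real facility.

```python
# Hypothetical illustration: quantifying "load jitter" from interval meter data.
# The load profile below is made up; real AI workloads vary widely.
per_minute_mw = [180, 240, 195, 260, 210, 255, 170, 265]  # 1-minute average demand (MW)

# Ramp between consecutive intervals, in MW per minute.
ramps = [b - a for a, b in zip(per_minute_mw, per_minute_mw[1:])]

print(f"Peak demand: {max(per_minute_mw)} MW")
print(f"Largest upward swing:   {max(ramps)} MW in one minute")
print(f"Largest downward swing: {min(ramps)} MW in one minute")
```

Even a simple summary like this (peak demand plus worst-case ramp) gives a utility far more to work with than a single nameplate megawatt figure.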
On-site generation solutions generally include natural gas turbines (often in combined-cycle configurations), diesel backup systems and renewables paired with a battery energy storage system (BESS). A notable example is the recently announced Utah High Performance Compute Data Center Campus in Millard County, led by Joule Capital Partners, Caterpillar Inc. and Wheeler Machinery Co. The project calls for upward of 4 GW of behind-the-meter generation, combined heat and cooling, and a 1.1 GWh BESS. While this is one of the largest behind-the-meter proposals to date, projects like it, and smaller setups as well, offer developers speed, control, reliability and a construction timeline they find more favorable.
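For a rough sense of scale, a battery's ride-through time is simply its energy capacity divided by the power it serves. The quick calculation below uses the project's stated figures under simplifying assumptions (no losses, full depth of discharge, the entire campus load on the battery alone) and is illustrative only.

```python
# Rough scale check (illustrative only): how long a BESS of a given energy
# capacity can carry a given load, ignoring losses and depth-of-discharge limits.
bess_energy_gwh = 1.1
served_load_gw = 4.0  # if the full campus load leaned on the battery alone

hours = bess_energy_gwh / served_load_gw
print(f"~{hours * 60:.0f} minutes of ride-through at {served_load_gw} GW")  # ~16 minutes
```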
This “prosumer” model—where an entity not only consumes electricity but also generates it, often independently of traditional utility providers—isn’t limited to on-site generation. Some data centers are entering into direct power purchase agreements (PPAs) with independent power producers (IPPs), bypassing utilities altogether. These arrangements represent a broader shift where organizations take energy procurement into their own hands, whether through on-site generation or strategic partnerships.
Utility Innovation in the Face of Growing Demand
Reliability is paramount for data centers, many of which aim for “five nines” (99.999%) of uptime, which equates to just over five minutes of downtime per year. Utilities, constrained by aging infrastructure and competing customer demands, struggle to guarantee this level of service. At the same time, the broader grid is under pressure. Retiring power plants, transmission bottlenecks and slow interconnection timelines are limiting available capacity for new resources. Additionally, grid operators such as the California Independent System Operator (CAISO), PJM Interconnection and the Electric Reliability Council of Texas (ERCOT) forecast peak shortages as early as 2027, suggesting that this mismatch between supply and demand isn’t going away any time soon.
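The downtime implied by availability targets like “five nines” follows directly from the percentage; the short calculation below assumes a 365-day year.

```python
# Annual downtime implied by common availability targets (assumes a 365-day year).
minutes_per_year = 365 * 24 * 60

for label, availability in [("three nines", 0.999), ("four nines", 0.9999), ("five nines", 0.99999)]:
    downtime = minutes_per_year * (1 - availability)
    print(f"{label} ({availability:.3%} uptime): ~{downtime:.1f} minutes of downtime per year")
```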
Even so, there are efforts to unlock latent capacity in existing infrastructure through grid-enhancing technologies such as dynamic line ratings and ambient-adjusted ratings. These innovations could accelerate interconnection timelines and reduce the need for costly new builds. As utilities explore these solutions, they demonstrate a proactive approach to meeting the challenges posed by electrification, decentralization and prosumer growth.
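The intuition behind ambient-adjusted ratings is that cooler air carries away more heat, so a conductor can carry more current before reaching its thermal limit. The sketch below uses a simplified thermal-headroom approximation that ignores wind, solar heating and conductor-specific detail (it is not the IEEE 738 calculation utilities actually use), and the ratings and temperatures shown are assumed values for illustration.

```python
import math

def ambient_adjusted_rating(static_rating_amps, max_conductor_temp_c,
                            assumed_ambient_c, actual_ambient_c):
    """Simplified ambient adjustment: resistive heating scales with I^2 and must be
    dissipated across the conductor-to-air temperature difference, so the allowable
    current scales roughly with the square root of the available thermal headroom.
    Ignores wind, solar gain and conductor properties (not the IEEE 738 method)."""
    headroom_assumed = max_conductor_temp_c - assumed_ambient_c
    headroom_actual = max_conductor_temp_c - actual_ambient_c
    return static_rating_amps * math.sqrt(headroom_actual / headroom_assumed)

# Illustrative numbers only: a line given a 1,000 A static rating at an assumed
# 40 C ambient, re-rated on a cool 10 C day with a 75 C conductor limit.
print(round(ambient_adjusted_rating(1000, 75, 40, 10)))  # ~1363 A
```

The point of the example is the direction of the effect, not the exact numbers: on cooler days a line may safely carry meaningfully more than its static rating suggests.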
Partnering for Longevity
As utilities innovate to expand capacity, data center developers continue to prioritize speed through on-site generation. This raises important questions: will these systems remain the primary power source, or transition to backup roles once grid interconnection is achieved? What are the implications of utilities acquiring and integrating behind-the-meter generation into the grid?
To navigate these questions, utilities and developers must evolve their planning and collaboration strategies. While behind-the-meter solutions offer speed to market, grid integration enhances long-term reliability and sustainability.
The Case for Registration and Transparency
One promising strategy for long-term reliability is accurate load registration, a process where developers submit detailed forecasts of their expected power usage. Today, speculative development and vague load profiles often hinder this process, making it difficult for utilities to plan effectively.
A case in point: ERCOT’s early demand forecasts were significantly inflated due to letters of interest that later proved unrealistic. After revising its approach, ERCOT reduced its projections by 50%, a shift that stemmed from improved feasibility assessments and more accurate forecasting methods. ERCOT now mandates that entities planning to interconnect loads of 75 MW or more follow its Large Load Interconnection process.
Florida Public Utilities Company (FPUC) maintains a Facility Interconnection Requirements framework covering generation, transmission and load-serving facilities. The framework aligns with industry standards, addresses technical specifications and emphasizes coordinated studies, communication with balancing authorities and annual documentation updates.
Nationally, organizations like GridLab are advocating for stronger utility practices to manage large load integration. Their recommendations emphasize early registration, visibility into load characteristics and minimum technical standards. GridLab identifies key challenges such as speculative development, lack of technical data and opaque interconnection processes—all of which pose risks to grid reliability. To address these, they suggest utilities develop accurate load models, conduct rigorous interconnection studies and monitor performance after connection.
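To make the idea of load registration concrete, the sketch below shows the kind of structured information such a filing might capture. The field names and sample values are hypothetical, not ERCOT's, FPUC's or GridLab's actual schema.

```python
from dataclasses import dataclass

@dataclass
class LargeLoadRegistration:
    """Hypothetical registration record; fields are illustrative only,
    not any utility's or grid operator's actual filing format."""
    project_name: str
    point_of_interconnection: str
    peak_demand_mw: float          # maximum expected coincident demand
    max_ramp_mw_per_min: float     # worst-case swing between intervals
    onsite_generation_mw: float    # behind-the-meter capacity, if any
    requested_inservice_date: str  # target energization date
    firm_commitment: bool          # signed agreement vs. letter of interest

registration = LargeLoadRegistration(
    project_name="Example AI Campus",
    point_of_interconnection="345 kV substation (TBD)",
    peak_demand_mw=300.0,
    max_ramp_mw_per_min=40.0,
    onsite_generation_mw=120.0,
    requested_inservice_date="2028-06",
    firm_commitment=False,
)
print(registration)
```

Fields like the ramp rate and the firm-commitment flag address exactly the gaps GridLab highlights: visibility into load behavior and a way to separate real projects from speculative letters of interest.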
The examples above reveal common themes: registration and load transparency are essential for realistic grid planning, and coordination among developers, utilities and regulators is key to streamlining interconnection while safeguarding infrastructure. As large loads become more prevalent, these evolving practices will play a vital role in maintaining grid stability and efficiency. Data center developers must do their part by forecasting their power requirements with transparency, coordination and realistic, not speculative, planning.
Toward a Collaborative Future
As data centers begin managing their own energy portfolios, utilities face the risk of being sidelined. To remain relevant, utilities must evolve from traditional energy suppliers to strategic partners in grid resilience. This means offering value that developers can’t easily replicate: deep local grid expertise, regulatory navigation and meaningful community integration.
Transparency and collaboration will be essential as utilities and data center developers explore this new landscape that demands flexibility and a shared commitment to building a resilient and responsive energy future.