Mark Knipfer leads data center services at Integrated Environmental Solutions, where he works with engineering teams and developers to evaluate performance and infrastructure strategies for high-performance facilities.
Utilities are being asked to plan for a new class of electricity demand, one that behaves less like traditional industrial load and more like a dynamic, high-density energy system.
The rapid growth of AI-driven data centers is placing unprecedented pressure on grid infrastructure. Estimates from the Electric Power Research Institute suggest that data centers could account for 9% to 17% of total U.S. electricity demand by 2030, up from roughly 3% to 4% today.
In many parts of the country, large facilities are requesting hundreds of megawatts of capacity, often on accelerated timelines. At the same time, utilities are being asked to make long-term investment decisions based on load profiles that are increasingly difficult to predict.
What’s happening as a result of this AI-driven demand is not simply a question of scale; it is a shift in the nature of demand itself.
Historically, utilities have planned around relatively stable demand patterns. Industrial loads tend to follow predictable operating schedules, and even large commercial loads exhibit gradual variation over time.
But AI data centers are different. High-density compute clusters can experience swings in power demand of as much as 40–50% over short periods, depending on workload intensity. These fluctuations create rapid changes in cooling and power requirements that ripple through the entire facility. From a grid perspective, this introduces a level of variability that traditional forecasting methods are simply not designed to capture.
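To make those swings concrete, here is a toy simulation of a high-density compute cluster whose demand jumps when large workloads arrive and tails off between them. All of the numbers (facility rating, background draw, burst sizes) are invented for illustration, not drawn from any real facility:

```python
import random

random.seed(0)

RATED_MW = 100.0          # hypothetical facility rating
BASE_FRACTION = 0.55      # assumed idle/background draw as a share of rated power

def synthetic_ai_load(minutes: int) -> list[float]:
    """Toy AI-cluster load: a steady background draw plus workload
    bursts that push demand toward the facility's rated power."""
    profile = []
    burst = 0.0
    for _ in range(minutes):
        if random.random() < 0.05:                # a large job arrives
            burst = random.uniform(0.3, 0.45)
        burst *= 0.98                              # gradual tail-off as the job winds down
        profile.append(RATED_MW * (BASE_FRACTION + burst))
    return profile

load = synthetic_ai_load(24 * 60)                  # one day, minute resolution
swing = (max(load) - min(load)) / max(load)
print(f"peak-to-trough swing: {swing:.0%}")
```

Even this crude sketch produces peak-to-trough swings in the 40% range over a single day, which is exactly the kind of behavior that a single peak-demand number cannot convey.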
The result is growing uncertainty at exactly the moment when utilities are being asked to commit to large-scale infrastructure investments.
The strain on planning and interconnection
This uncertainty is already showing up in interconnection queues and planning timelines. Utilities must evaluate whether they can reliably serve large, concentrated loads without compromising system stability. Individual projects can require 100-500 MW of capacity, with some multi-phase developments targeting gigawatt-scale demand over time. Compounding this, utilities face pressure to accelerate connection timelines to maintain competitiveness in attracting data center investment.
In many cases, developers are being asked to provide more detailed evidence of how their facilities will behave in operation, not just at peak demand, but across a range of real-world conditions. This reflects a broader shift: it is no longer enough to specify maximum load; utilities increasingly need to understand how that load will vary over time.
That creates a challenge for both sides. Developers must provide credible, defensible demand profiles, while utilities must incorporate these profiles into planning processes that were built around very different, largely static assumptions.
The stakes are high for all parties. With data center electricity demand expected to double or even triple in the coming decade, the consequences of misjudging load profiles are becoming more significant. Overestimating demand can lead to overbuilt infrastructure, stranded assets and higher costs for ratepayers. Underestimating demand, on the other hand, risks congestion, reliability challenges and costly upgrades after the fact.
All the while, policymakers and regulators are understandably paying closer attention to how data center growth could affect electricity prices, grid reliability and long-term resource planning. In some regions, interconnection queues already exceed 2-3 times current peak demand, according to independent system operator data, and in several states, questions are being raised about cost allocation and whether large new loads should bear a greater share of infrastructure investment.
In this environment, uncertainty is as much a financial and regulatory issue as it is a technical one. Addressing this challenge requires a shift in how large loads are evaluated.
Rather than relying solely on peak-demand estimates, as has historically been the case, utilities and developers need a more complete understanding of how data center loads behave over time. This includes not only average consumption, but variability, ramp rates and the interaction between IT workloads, cooling systems and environmental conditions.
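As a rough sketch of what looking beyond peak demand means in practice, the snippet below takes a day of hourly demand samples and derives the metrics named above: average consumption, variability and the worst hour-to-hour ramp. The hourly figures are invented for illustration:

```python
from statistics import mean, pstdev

# Hypothetical hourly demand samples (MW) for one day at a large
# facility; illustrative numbers, not measured data.
demand_mw = [62, 60, 58, 57, 59, 66, 78, 91, 97, 95, 88, 82,
             79, 85, 96, 99, 94, 87, 80, 74, 70, 67, 64, 63]

avg = mean(demand_mw)
variability = pstdev(demand_mw) / avg             # coefficient of variation
ramps = [b - a for a, b in zip(demand_mw, demand_mw[1:])]
max_ramp = max(abs(r) for r in ramps)             # worst hour-to-hour change

print(f"average load:     {avg:.1f} MW")
print(f"variability (CV): {variability:.0%}")
print(f"max hourly ramp:  {max_ramp} MW")
```

A planner handed only the 99 MW peak would miss that this facility averages well below that figure but can move by double-digit megawatts in a single hour, which is the information ramp-rate and variability metrics are meant to surface.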
Consequently, a growing number of developers are beginning to use more advanced modeling approaches to generate these insights. Increasingly, this is being supported by physics-based simulation tools that allow planners to evaluate how different design and operational strategies perform under real-world conditions. By simulating facility performance across a full year of operation, they can produce more realistic demand profiles that reflect how systems actually behave under varying conditions.
For utilities, this type of information can support more informed planning decisions. It provides greater visibility into how large loads will interact with existing infrastructure and can help identify potential constraints before they become critical issues.
A shared responsibility
Ultimately, the challenge of integrating AI-driven data center demand into the grid is a shared one.
Utilities, developers and regulators all have a role to play in ensuring that new capacity can be delivered in a way that is both economically and operationally sustainable.
For utilities, this means evolving planning frameworks to account for more dynamic and uncertain load profiles. For developers, it means providing clearer, evidence-based insight into how their facilities will perform in operation. And for regulators, it means balancing the need for infrastructure investment with the protection of ratepayers.
AI is reshaping every industry it touches, and is now beginning to reshape the energy system as well. The question is whether planning approaches will evolve quickly enough to keep pace.