Scott Engstrom is chief customer officer for GridX.
Somewhere in your service territory, a data center developer is scouting locations. Maybe several. And somewhere in your utility organization, people are worried about what that means for your existing customers, the impact of this new load growth on the grid, and your next rate case.
That worry is understandable. My three decades in this industry have made it clear that every major load growth event — from the adoption of air conditioning to the introduction of electric vehicles — carries real financial risk. Utilities build infrastructure on the promise of sustained revenue, and when that promise falls short, utility customers can end up bearing the brunt of the shortfall.
However, what often gets overlooked in the current debate around data centers is that they don't have to be a liability. With the right rate structures and contractual mechanisms, these facilities can strengthen grid operations while paying their fair share of costs.
The conventional wisdom treats data centers as inflexible monsters that demand power around the clock with zero give. That characterization made sense a decade ago, but it misses how sophisticated hyperscale operators actually run their facilities.
Modern data centers maintain load factors above 75%, making their demand far more predictable than that of most commercial loads. Their cooling systems, which account for a substantial portion of total energy consumption, can be modulated without affecting core computing operations. Many facilities already have backup generation and battery storage that could serve as dispatchable grid resources during peak events. The largest operators can shift computational workloads across geographic regions, thereby reducing localized demand without disrupting service.
None of this happens automatically. These capabilities only translate into grid benefits when tariff structures create meaningful economic incentives. Without those incentives, operators have no reason to participate in demand response or deploy their backup assets to support the system. The flexibility sits idle because traditional rate design doesn't recognize or reward it.
The legitimate concern around data centers isn't about flexibility. It's about cost recovery. Serving these loads requires significant investment in transmission and distribution upgrades, and if projects stall or operators walk away, existing customers absorb the stranded costs.
Several utilities have begun addressing this issue through contractual safeguards that warrant broader adoption. Ameren Missouri now requires minimum contract terms, exit fees that cover undepreciated infrastructure costs and a commitment to consume a set portion of contracted capacity. AEP Ohio's data center tariff mandates either a minimum credit rating or financial collateral equal to half of the total minimum charges for the contract term. These aren't punitive measures. They're basic cost-of-service protections that prevent one customer class from subsidizing another.
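To put rough numbers on what protections like these mean in practice, here is a stylized calculation for a hypothetical 100 MW facility. The dollar figures, depreciation schedule and minimum-bill assumptions are illustrative only; they are not the terms of the Ameren Missouri or AEP Ohio filings.

```python
# Stylized illustration of data center contract safeguards.
# All figures are hypothetical and do not reflect any utility's filed tariff.

def exit_fee(invested_cost: float, annual_depreciation: float, years_elapsed: float) -> float:
    """Exit fee sized to recover infrastructure cost not yet depreciated."""
    return max(invested_cost - annual_depreciation * years_elapsed, 0.0)

def collateral_requirement(monthly_minimum_charge: float, contract_months: int) -> float:
    """Collateral set at half of the total minimum charges over the contract term."""
    return 0.5 * monthly_minimum_charge * contract_months

# Hypothetical 100 MW data center served by $150M of dedicated upgrades,
# depreciated over 30 years, under a 12-year contract with a $2.5M monthly
# minimum bill (e.g., 80% of contracted demand at assumed rates).
print(f"Exit fee after 5 years: ${exit_fee(150e6, 150e6 / 30, 5):,.0f}")
print(f"Required collateral:    ${collateral_requirement(2.5e6, 12 * 12):,.0f}")
```

The specific percentages and terms vary by utility and filing; the point is that both protections scale with the infrastructure at risk and the revenue commitment behind it.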
The irony is that most sophisticated data center operators prefer these arrangements. Long-term contracts provide certainty for their own capital planning. Minimum load commitments align with how they actually operate. Credit requirements are a standard practice in commercial real estate. What these customers can't tolerate is regulatory uncertainty or rate structures that change arbitrarily.
Once baseline cost protection is established, utilities can design tariffs that go further by transforming data centers from large loads to be served into resources that enhance grid performance.
Interruptible demand riders offer a straightforward path. Lower rates in exchange for the ability to curtail load during system emergencies give utilities dispatchable load reduction without building new generation. Data centers equipped with backup generation can ride through curtailment events without interrupting their own operations, a mutually beneficial outcome.
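As a rough sketch of the economics, assume a demand-charge credit on the curtailable portion of load and a penalty for energy not curtailed when an event is called. The rates below are invented for illustration, not drawn from any utility's rider.

```python
# Illustrative interruptible-rider economics; all rates are assumed, not a real tariff.

CURTAILABLE_MW = 40          # load the operator agrees to shed when dispatched
CREDIT_PER_KW_MONTH = 4.00   # assumed demand-charge credit on curtailable capacity
PENALTY_PER_KWH = 2.50       # assumed charge for energy not curtailed during an event

def monthly_credit(curtailable_mw: float) -> float:
    return curtailable_mw * 1_000 * CREDIT_PER_KW_MONTH

def event_penalty(shortfall_mw: float, event_hours: float) -> float:
    return shortfall_mw * 1_000 * event_hours * PENALTY_PER_KWH

# A facility that reliably curtails keeps the credit; one that does not gives
# much of it back, which is what makes the load dispatchable in practice.
print(f"Monthly credit:              ${monthly_credit(CURTAILABLE_MW):,.0f}")
print(f"Penalty if 10 MW short, 4 h: ${event_penalty(10, 4):,.0f}")
```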
More sophisticated approaches leverage real-time pricing to encourage load shifting. When operators see actual cost signals — not averaged rates that obscure system conditions — they can schedule intensive computing tasks for off-peak hours. Critical Peak Pricing during system stress events creates even stronger incentives for demand reduction. The PJM region has seen growing interest in these mechanisms precisely because the growth of data centers there has made grid management more challenging.
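A minimal sketch of that load-shifting logic, assuming a deferrable compute batch and an hourly day-ahead price signal (both the prices and the job parameters are hypothetical):

```python
# Sketch: schedule a deferrable compute batch into the cheapest hours of the day.
# Prices ($/MWh) and the job itself are hypothetical, for illustration only.

hourly_prices = [
    32, 30, 28, 27, 27, 29, 38, 52, 61, 58, 55, 54,
    57, 63, 74, 88, 95, 102, 84, 66, 54, 45, 38, 34,
]  # 24 hourly day-ahead prices, $/MWh

JOB_HOURS = 6        # the batch needs 6 hours of runtime today
JOB_LOAD_MW = 20     # at a steady 20 MW

# Pick the 6 cheapest hours; a production scheduler would also honor deadlines,
# data locality and ramp limits, but the price-following principle is the same.
cheapest_hours = sorted(range(24), key=lambda h: hourly_prices[h])[:JOB_HOURS]

shifted_cost = sum(hourly_prices[h] for h in cheapest_hours) * JOB_LOAD_MW
peak_cost = sum(sorted(hourly_prices, reverse=True)[:JOB_HOURS]) * JOB_LOAD_MW

print(f"Run in hours {sorted(cheapest_hours)}")
print(f"Energy cost if shifted:    ${shifted_cost:,.0f}")
print(f"Energy cost at peak hours: ${peak_cost:,.0f}")
```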
Perhaps most promising is compensating data centers for deploying their on-site generation and storage as virtual power plants during peak periods. Duke Energy's Accelerating Clean Energy tariffs represent early steps in this direction, creating pathways for large customers to support grid reliability while advancing clean energy goals. When a data center runs its backup systems to offset load during a summer peak, everyone benefits. The operator receives compensation, the utility avoids expensive wholesale purchases and system reliability improves.
The ability to rapidly model, simulate, implement and refine complex tariffs while understanding system-wide impacts has become a strategic capability rather than an operational detail. Utilities need platforms that can analyze customer data to identify revenue impacts and equity concerns before filing new rate cases. They need billing systems that can handle intricate calculations for real-time pricing, multi-layered demand charges and virtual power plant compensation without manual intervention.
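As a simplified picture of what those calculations involve, the sketch below computes a single monthly bill that combines hourly real-time energy charges, a demand charge on the monthly peak and a credit for virtual power plant dispatch. The tariff structure and rates are assumed for illustration; no filed tariff is this simple.

```python
from dataclasses import dataclass

# Simplified monthly bill for a large flexible load under an assumed tariff
# with real-time energy pricing, a demand charge and a VPP dispatch credit.
# All rates are illustrative.

@dataclass
class Tariff:
    demand_charge_per_kw: float = 12.0    # $/kW on monthly peak demand
    vpp_credit_per_kwh: float = 0.20      # $/kWh for energy offset on dispatch

def monthly_bill(hourly_load_kw: list[float],
                 hourly_price_per_kwh: list[float],
                 vpp_dispatch_kwh: float,
                 tariff: Tariff) -> dict[str, float]:
    energy = sum(kw * p for kw, p in zip(hourly_load_kw, hourly_price_per_kwh))
    demand = max(hourly_load_kw) * tariff.demand_charge_per_kw
    vpp_credit = vpp_dispatch_kwh * tariff.vpp_credit_per_kwh
    return {
        "energy_charge": energy,
        "demand_charge": demand,
        "vpp_credit": -vpp_credit,
        "total": energy + demand - vpp_credit,
    }

# Toy month: flat 50 MW load, $0.06/kWh average price, 200 MWh of VPP dispatch.
hours = 720
bill = monthly_bill([50_000.0] * hours, [0.06] * hours, 200_000.0, Tariff())
for line, amount in bill.items():
    print(f"{line:>15}: ${amount:,.0f}")
```

Real tariffs layer on seasonal rates, ratchets and riders that this sketch ignores, which is exactly why automated, auditable calculation matters.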
Most critically, utilities need the ability to iterate. Initial rate designs rarely achieve perfect outcomes. Load patterns shift, customer behavior adapts and policy priorities evolve. The utilities that thrive will be those that can measure actual results against projections, identify where rates aren't performing as intended and refine structures quickly rather than waiting for the next rate case cycle.
The current public discourse around data centers and grid impact has become unnecessarily binary. Either these facilities represent an existential threat to grid stability and customer affordability, or they are economic development opportunities that deserve accommodation regardless of cost. Neither framing serves utilities or their customers well.
A more productive approach starts with recognizing that data center load growth is happening regardless of utility preferences. The relevant question isn't whether to serve these loads, but how to structure that service in ways that protect existing ratepayers, recover actual costs and leverage inherent flexibility for broader system benefits.
That requires moving beyond traditional rate design assumptions, which were built for a different era. It means investing in analytical capabilities that reveal customer impacts before implementation rather than after. It demands attention to equity concerns, particularly for low- and moderate-income households who have the least ability to absorb unexpected bill increases. It requires regulatory partnerships focused on achieving equitable outcomes, supported by rigorous data rather than rigid adherence to historical precedent.
The utilities that figure this out will have navigated one of the most significant load growth events in decades while maintaining rate equity and building stakeholder trust. Those that don't will spend years in contentious rate cases, managing customer complaints and explaining why their residential customers are subsidizing infrastructure they never asked for. The path forward isn't about opposing data center growth or embracing it uncritically. It's about getting the analytics and rate mechanisms right so that every customer class benefits and rates that look good on paper actually deliver equitable outcomes in practice.