Vijay Narayan is business unit head of manufacturing, logistics, energy and utilities, and Syama Sundar Peesapati is SBU head for energy and utilities at Cognizant.
What happens when the power grid isn’t built for what’s coming next? After more than a decade of flat growth, U.S. electricity demand jumped 3% in 2024, the fifth-largest increase this century, driven by electrification, energy-hungry data centers and shifting populations. Much of the infrastructure that utility companies rely on today can’t handle that kind of load. At the same time, these companies are juggling aging systems, tight rate-case constraints and growing pressure to do more with less.
Many are turning to generative AI to help, but the road to real value isn’t straightforward. With up to 40% of large-scale AI projects expected to fail by 2027, the risk of wasted investment is real — especially in a sector where reliability, safety and public trust leave no room for error.
Critical success factors for utility-scale AI implementation
Ask anyone who’s worked in this industry long enough, and they’ll tell you: Most of the systems utility companies use are siloed and weren’t designed to work together.
The geographic information system shows where the substation sits. The enterprise asset management system tracks what that transformer is made of, warranty status and usage. Then there’s a separate SCADA system to manage how power flows. If something goes wrong, operators must triangulate across all three just to diagnose the issue and respond.
That’s a basic use case and it’s already too complex. But these systems are also old, the data is messy and the level of digital maturity across the industry is generally poor. If your data is inaccurate or inconsistent, generative AI may hallucinate to fill the gaps and confidently deliver a wrong answer.
Even if the tools work, there’s still the question of who’s responsible for funding the infrastructure to support them. Utility companies can’t just raise rates to cover upgrades. In regulated states, they have to go through the rate case process and justify the investment to regulators, who often push back. Many companies are under pressure to modernize the grid but lack the capital to make it happen. That’s why AI can’t be a moonshot. It must free up dollars elsewhere in the business. Otherwise, it’s just another stalled initiative.
The good news is utility companies don’t need a perfect system or a blank check to make AI work. They just need to start small, prove value and build the right foundations along the way.
Start with the right architecture: Pilots and proofs of concept are a good start, but to scale AI across the enterprise, utility companies need a solid foundation. That begins with an enterprise AI architecture aligned to business processes. Without that alignment, organizations risk solving only a small part of the problem.
For instance, onboarding commercial electricity customers, such as factories, involves tracking requirements from the number of substations to air conditioning load, across various inputs, including spreadsheets, emails and even handwritten notes. Traditionally, a data entry operator would manually enter all of this into backend systems. But with the right AI architecture in place, utilities can ingest all those formats regardless of language or file type and automatically convert them into a processable order. That level of complexity can’t be automated in a silo. It requires infrastructure built to handle variability from the start.
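As a minimal sketch of what "ingest any format, emit a processable order" can look like, the snippet below routes each document type to its own parser and normalizes the result into one structured record. All names (ServiceOrder, the field names, the keyword-extraction stub standing in for an LLM call) are illustrative assumptions, not a description of any specific vendor's architecture.

```python
from dataclasses import dataclass, field

# Hypothetical normalized order record; field names are illustrative only.
@dataclass
class ServiceOrder:
    customer: str
    substations: int
    hvac_load_kw: float
    source_docs: list = field(default_factory=list)

def from_spreadsheet(row: dict) -> dict:
    """Structured input: map spreadsheet columns directly."""
    return {"customer": row["Customer"],
            "substations": int(row["Substations"]),
            "hvac_load_kw": float(row["HVAC kW"])}

def from_email(text: str) -> dict:
    """Unstructured input: a production pipeline would call an LLM
    extraction step here; this stub just pulls 'key: value' lines."""
    fields = dict(line.split(":", 1) for line in text.splitlines() if ":" in line)
    return {"customer": fields["customer"].strip(),
            "substations": int(fields["substations"]),
            "hvac_load_kw": float(fields["hvac load kw"])}

# New formats (scanned notes, PDFs) plug in as additional parsers.
PARSERS = {"spreadsheet": from_spreadsheet, "email": from_email}

def ingest(doc_type: str, payload) -> ServiceOrder:
    """Route any supported format to a common, processable order."""
    data = PARSERS[doc_type](payload)
    return ServiceOrder(**data, source_docs=[doc_type])

order = ingest("email", "customer: Acme Foundry\nsubstations: 2\nhvac load kw: 450")
print(order.customer, order.substations, order.hvac_load_kw)
# → Acme Foundry 2 450.0
```

The design point is the parser registry: variability in input formats is absorbed at the edge, so everything downstream works against one schema.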
Define a real data strategy: Equally important is a clear data strategy. If data quality doesn’t improve, utilities will simply feed bad inputs into smarter systems. Successful utility companies are investing in cleaning, standardizing and governing their data before deploying new tools. That includes aligning AI to their most critical workflows rather than layering chatbots onto broken processes.
Build cross-functional ownership: AI is not an IT project; it’s a business transformation. Forward-looking utility companies are putting dedicated, cross-functional teams in place, often led by chief data or AI officers, who work with business users to define high-impact problems and continuously iterate solutions. These teams are responsible for managing and enhancing generative AI deployments over time, ensuring the tools evolve with business needs.
Bake in governance from day one: Utility companies operate under more scrutiny than most. The moment data moves beyond the organization’s walls, the company loses some control over what generative AI produces. That’s why human-in-the-loop systems, permission controls and auditable decisions can’t be an afterthought. Compliance must be built into the design, especially when AI models rely on third-party sources like satellite imagery and weather data.
For example, utilities are using AI to better manage vegetation trimming, a task that can cost billions annually. By analyzing visual data from Google Earth alongside rainfall trends, AI can identify the regions most likely to see overgrowth and prioritize which areas need trimming first. This helps crews operate more efficiently while keeping infrastructure safe.
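A toy version of that prioritization logic might blend a satellite-derived overgrowth signal with a rainfall trend into a single risk score and sort work areas by it. The weights, inputs and feeder names below are entirely hypothetical assumptions for illustration; a real system would calibrate against historical outage and trim-cycle data.

```python
def vegetation_risk(canopy_growth_index: float, rainfall_anomaly_mm: float) -> float:
    """Unitless priority score. Heavier weight on observed canopy growth
    (0-1 from imagery analysis); positive rainfall anomalies accelerate
    regrowth, so they add risk. Weights here are illustrative, not tuned."""
    return 0.7 * canopy_growth_index + 0.3 * max(rainfall_anomaly_mm, 0) / 100

# Hypothetical feeder corridors with (growth index, rainfall anomaly in mm).
regions = {
    "feeder-12": vegetation_risk(0.8, 120),   # wet, overgrown corridor
    "feeder-07": vegetation_risk(0.3, -40),   # dry, recently trimmed
}

# Trim crews are dispatched to the highest-risk corridors first.
queue = sorted(regions, key=regions.get, reverse=True)
print(queue)
# → ['feeder-12', 'feeder-07']
```

The same score-and-rank pattern carries over to the storm-response case: swap the inputs for substation exposure and forecast severity, and the sorted queue becomes a pre-positioning plan.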
A similar approach supports storm response: By predicting which substations or power lines are at greatest risk during a storm, utilities can pre-position equipment in the impact zone, accelerating recovery after the event.
These use cases only work when governance is considered from the start, ensuring external data sources are reliable, risks are understood and AI-generated insights are trustworthy and actionable.
Based on the rapid demand we're seeing from utilities, the next 10 years will hinge on how companies build smarter grids and run smarter organizations. As generative AI matures, the ability to integrate proven agents into complex ecosystems will be key to scaling innovation and sustaining long-term progress.