Artificial intelligence is uniquely positioned to impact the electricity industry from both ends: as the technology driving large load demand growth and as a tool with the potential to make the power system more efficient.
Utilities are already using deep learning algorithms for load and weather forecasting, vegetation management, demand response, distributed resource aggregation and more. As neural networks become more deeply integrated into new software, they could unlock system flexibility and maximize existing assets. But they also require significantly more power to run, and they potentially introduce new security vulnerabilities.
This trendline gathers the best of Utility Dive’s recent coverage of artificial intelligence in the power sector.
Portland General Electric invests in AI-powered flexibility to speed data center connection
The utility is working with startup GridCARE to help data centers plug into transmission-constrained grids without waiting for new infrastructure, executives said.
By: Brian Martucci • Published Oct. 20, 2025
Portland General Electric has freed up more than 80 MW for data center interconnections next year using a new, artificial intelligence-enabled flexibility tool, the utility said Oct. 8.
PGE partnered with the California-based startup GridCARE, which uses AI, detailed hourly demand modeling and optimized flexible resources like batteries and onsite generators to find spare capacity. The added flexibility allows PGE “to interconnect multiple data center customers years earlier than initially expected,” the companies said.
“We are taking tools that have been in use for many years, that are well understood and well utilized … and moving them into the [grid] planning space,” GridCARE CEO Amit Narayan said in an interview. “In doing that, we realized we could open up quite a bit of capacity.”
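For illustration only, here is a minimal sketch of the kind of hourly headroom analysis described above: given an hourly load profile, a transmission limit and a flexible resource that can cover shortfall hours, how much new flat data center load could be served? The numbers, function names and the simple bisection approach are assumptions for this sketch, not GridCARE's actual model.

```python
# Illustrative sketch: hourly headroom plus a flexible resource can admit more flat
# data center load than the single worst peak hour would allow. All values are hypothetical.

def max_new_flat_load(hourly_load_mw, limit_mw, flex_mw, flex_hours_per_day):
    """Largest constant new load (MW) servable if a flexible resource can offset
    up to `flex_mw` for at most `flex_hours_per_day` hours each day."""
    headroom = [limit_mw - load for load in hourly_load_mw]   # spare capacity each hour
    lo, hi = 0.0, max(headroom)
    for _ in range(50):                                       # bisect on the new load size
        mid = (lo + hi) / 2
        # Hours where the new load exceeds native headroom must be covered by flexibility.
        deficit_hours = [h for h in headroom if mid > h]
        ok = (all(mid - h <= flex_mw for h in deficit_hours)
              and len(deficit_hours) <= flex_hours_per_day)
        lo, hi = (mid, hi) if ok else (lo, mid)
    return lo

# Example: a 500-MW-limited area that sits near its limit for only a few hours a day.
load = [430, 420, 410, 405, 400, 410, 440, 460, 470, 480, 485, 490,
        495, 498, 499, 497, 490, 480, 470, 460, 450, 445, 440, 435]
print(round(max_new_flat_load(load, limit_mw=500, flex_mw=60, flex_hours_per_day=6), 1))
```

In this toy example, the worst hour alone would leave only 1 MW of room for a flat new load; letting a flexible resource cover a handful of peak hours opens up roughly 15 MW.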
PGE’s territory includes the Portland suburb of Hillsboro, long a global hub for chipmaker Intel. More recently, Hillsboro developed into a major data center hub thanks to its position at the mouth of an undersea fiber highway connecting North America and Asia, said Larry Bekkedahl, PGE’s senior vice president of strategy and advanced energy delivery.
Bekkedahl said Hillsboro alone has 800 MW of data center capacity operating today on a grid with a total system load of about 4.5 GW. PGE has around 3 GW of active data center load requests, more than 400 MW of which could energize by 2029, he added.
Bekkedahl said data centers eager to hook up to the transmission-constrained Pacific Northwest grid are coming to the realization that more onsite or nodal flexibility can speed up the process. That’s a big change from just a few years ago, when data centers presented as flat, round-the-clock loads that could push the system to its limits during the five or 10 days each year when demand peaks.
“We’d say to them, if that’s what you need, you’re going to have to wait until the transmission’s built… and so they realize they can flex themselves in our queue to get hooked up sooner,” he said.
The steady deployment of batteries, onsite generation and microgrids over the past decade has “created an opportunity that was not available to planners previously,” said Narayan, who founded and led virtual power plant platform AutoGrid before Uplight acquired it in 2023. Customers tend to use those resources for reliability and price hedging rather than mitigating strain on the grid, he said.
With electricity demand rising and speed to power top of mind for data center developers and tenants, Narayan said the moment for solutions like GridCARE’s has arrived. Every megawatt of additional capacity adds $30 million to $40 million in value for large data centers, he said.
And because the U.S. takes longer to build energy infrastructure than its principal geopolitical competitor, there’s a national security dimension to getting more out of the existing grid as well, he said.
“The U.S. is ahead in chips and algorithms, but the speed at which things are getting built in China and other places puts us at risk,” Narayan said.
Bekkedahl cautioned that grid flexibility can’t on its own address the challenges posed by headline-grabbing hyperscale AI data center proposals, which envision gigawatts of concentrated computing capacity. Former Energy Secretary Rick Perry’s Fermi America project, for example, has said it will deploy 11 GW of AI computing capacity, along with the energy infrastructure to support it, in northwest Texas. Meta has more advanced plans for a data center in Louisiana that Entergy has said may require up to 2.2 GW of power.
“We do not solve the 2-GW data center problem,” he said.
But Bekkedahl said GridCARE’s tools gave PGE confidence to bring on data center loads ranging from 50 MW to 500 MW. It helped the utility and data centers work together to ramp up load and service in parallel and develop operational parameters for service agreements as new computing capacity came online, he said.
Narayan said it meant a lot that PGE took a chance on GridCARE, which grew out of the Stanford Sustainability Accelerator program. Famously risk-averse utilities like to see proof of concept before investing in non-wires alternatives, so this first real-world deployment is a big boost, he said.
“Once we show this can be done, it opens up a lot of opportunities in other places,” he said. “If we can get more from our existing infrastructure, that is helpful to all consumers using that infrastructure.”
Public disclosures of AI risk surge among S&P 500 companies
A report by The Conference Board shows companies are flagging concerns about cyber and reputational risk as they increase deployment.
By: David Jones • Published Oct. 15, 2025
More than seven of every 10 public companies on the S&P 500 now flag their use of artificial intelligence as a material risk in their public disclosures, according to a report released by The Conference Board.
That figure represents a sharp increase from just 12% in 2023, reflecting the rapid adoption of AI among major enterprises.
“This is a powerful reflection of how quickly AI has developed from a niche topic to widely adopted and embedded in the organization,” Andrew Jones, a principal researcher at the Conference Board Governance & Sustainability Center, told Cybersecurity Dive via email.
AI has moved beyond the experimentation phase at major enterprises and is embedded across core business systems, including product design, logistics, credit modeling and customer interfaces, Jones said.
The report shows that corporate boards and C-suite leaders are addressing a range of risk factors in connection with AI deployment.
Reputational risk is the most widely disclosed issue, at 38%, according to the report. This reflects the potential for a brand to lose customer trust after a service breakdown, a mishandling of consumer privacy or a customer-facing tool that fails to deliver.
Cybersecurity risk is cited by 20% of firms. AI increases the attack surface, and companies are also at risk from third-party applications.
Legal and regulatory risks are also a major issue, as state and federal governments have moved quickly to set up guardrails that protect the public while leaving companies enough room to keep innovating.
While AI deployment is rapidly evolving in the enterprise, corporate leaders are still struggling to fully develop the governance structures to manage its use.
The PwC “2025 Annual Corporate Director’s Survey” shows only 35% of corporate boards have formally integrated AI into their oversight responsibilities, an indication that companies are still working to develop stricter guardrails.
“Directors recognize that AI brings both strategic opportunity and fiduciary risk, and many are starting to consider how to strengthen governance through regular education, clear oversight structures, and responsible-use frameworks,” Ray Garcia, leader of PwC’s Governance Insights Center, told Cybersecurity Dive.
Trump wants to use AI to prevent wildfires. Utilities are trying. Will it work?
The president wants to remake wildfire policy with an increased emphasis on technology and a new, consolidated federal wildland fire service. Experts have mixed reactions.
By: Emma Penrod • Published July 18, 2025
The United States has already experienced more wildfires this year than it has over the same period in any other year this decade, according to the National Interagency Fire Center.
With the risk of fire expected to grow due to climate change and other factors, utilities have increasingly turned to technology to help them keep up. And those efforts could get a boost following President Donald Trump’s June 12 executive order calling on federal agencies to deploy technology to address “a slow and inadequate response to wildfires.”
The order directed agencies to create a roadmap for using “artificial intelligence, data sharing, innovative modeling and mapping capabilities, and technology to identify wildland fire ignitions and weather forecasts to inform response and evacuation.” It also told federal authorities to declassify historical satellite datasets that could be used to improve wildfire prediction, and called for strengthening coordination among agencies and improving wildland and vegetation management.
Additionally, the order laid out a vision for consolidating federal wildfire prevention and suppression efforts that are currently spread across agencies. The White House’s proposed 2026 budget blueprint would create a new, unified federal wildland fire service under the Department of the Interior.
So far, Trump’s directive has drawn a mixed response from wildfire experts. While some said it could empower local governments and save utilities money, others said the order’s impact will be limited.
“I think some people read into the order more than is there, and some people read less,” said Chet Wade, a spokesperson for the Partners in Wildfire Prevention coalition. “I don't know exactly what will come of it, but getting technology into the right hands could be very helpful.”
Fire prevention goes high tech
Since the 2018 Camp Fire that bankrupted PG&E and set a nationwide precedent for suing utilities that trigger large fires, energy companies around the U.S. have invested billions of dollars in grid hardening and undergrounding power lines. Public safety power shutoffs are now routine during high-risk weather for many utilities, especially in the West. Many utilities have invested in new monitoring equipment and artificial intelligence to better detect and prevent fires.
Abhishek Singh, CEO and co-founder of AiDash, which builds software for monitoring wildfire risk, said technology has allowed utilities to make significant strides in areas like tree management, where AI paired with satellite imagery can help them identify high-risk areas. Trump’s executive order, he said, could make this technology even cheaper by releasing satellite imagery that companies like AiDash can use to train their AI models.
That, in turn, could make AI monitoring more accessible to local governments and firefighting agencies with limited budgets, Singh said. He believes more government monitoring will allow utilities to relax their own surveillance efforts, freeing resources for other concerns like hardening grid infrastructure.
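As a rough illustration of the satellite-plus-AI vegetation screening Singh describes, the sketch below scores line spans by canopy proximity, vegetation dryness and outage history. The field names, weights and thresholds are assumptions for this example, not AiDash's actual model.

```python
# Hypothetical vegetation risk scoring from satellite-derived attributes.
from dataclasses import dataclass

@dataclass
class Span:
    span_id: str
    canopy_distance_m: float   # tree canopy clearance from the conductor (satellite-derived)
    ndvi_dryness: float        # 0 = lush vegetation, 1 = very dry
    historical_outages: int    # past vegetation-caused outages on this span

def risk_score(s: Span) -> float:
    proximity = max(0.0, 1.0 - s.canopy_distance_m / 5.0)  # clearance under 5 m raises risk
    history = min(s.historical_outages / 3.0, 1.0)
    return round(0.5 * proximity + 0.3 * s.ndvi_dryness + 0.2 * history, 3)

spans = [Span("A-101", 1.2, 0.8, 2), Span("B-202", 6.0, 0.4, 0)]
for s in sorted(spans, key=risk_score, reverse=True):
    print(s.span_id, risk_score(s))   # rank spans for trimming crews
```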
Former FERC commissioner and chairman Neil Chatterjee, who has accepted a role as an advisor to AiDash, shares Singh’s views on the potential of technology to mitigate wildfire risk. The executive order, he said, could “modernize wildfire prevention and bring federal policy in line with the technology that is available.”
Technology advances as risk grows
While Wade said he hoped the federal government improves coordination between agencies and increases access to public lands for vegetation management, he expressed skepticism the executive order would have much impact.
For one, he said, it calls for giving more authority to local governments, but it doesn’t say anything about ensuring those communities have the resources they need to address wildfires.
The president’s budget blueprint calls for funding the new federal fire service with $3.7 billion, plus $2.8 billion for a wildfire suppression reserve fund. But it would eliminate funding from other agencies and programs that have historically played a role in addressing wildfires, including grants to support local firefighting efforts and the management of state and privately-owned forests.
“While the budget provides robust support for Federal wildland fire management activities alongside States and local partners, these partners should be encouraged to fund their own community preparedness and risk mitigation activities,” the proposed budget states.
U.S. Forest Service Chief Tom Schultz confirmed in testimony before lawmakers on the Senate Energy and Natural Resources Committee last week that the administration still has not released some firefighting funds to states for the current fiscal year, saying, “We’re still in discussion.”
Trump has walked back his earlier calls to eliminate the Federal Emergency Management Agency in recent days. But the tax and spending bill he signed into law July 4 cuts funding for research and forecasting of climate-related threats. Partners in Wildfire Prevention said it was too early to determine what the cuts mean for wildfire mitigation.
Andrew Dressel, a power industry consultant and vice president at Charles River Associates, cautioned that even as utilities prepare and technology advances, the risk of fire continues to increase as well.
While the order focuses on prescribed burns and wildland management, he said recent urban fires like those that devastated Los Angeles have called attention to the need to harden homes and yards against fire. They have also raised concerns about the role of idle power lines in starting fires. But moving, upgrading or removing power lines is expensive and time-consuming, and utilities are facing these costs at a time of unprecedented projected load growth.
“An executive order can only do so much,” Dressel said. “We need legislation, federal legislation or state legislation or both, to really move the needle on these things.”
AI-powered attacks rise even as companies prioritize AI security risks
Security executives are concerned about flaws in AI agents but also eager to see them replace humans in some roles, according to a new report.
By: Eric Geller • Published July 17, 2025
One in four chief information security officers, or CISOs, has experienced an AI-generated attack on their company’s network in the past year, and AI risks now top their priority lists, according to a report by cybersecurity firm Team8.
The true number of companies targeted by AI-powered attacks “may be even higher,” Team8 said in its report, “as most AI-driven threats mimic human activity and are difficult to detect without advanced metrics like time to exploitation and velocity indicators.”
AI outranked vulnerability management, data loss prevention and third-party risk on CISOs’ priority lists, according to the report, which is based on interviews with more than 110 security leaders from major enterprises.
AI is creating a range of new cybersecurity challenges for CISOs, from newly effective attacks to newly vulnerable technology platforms.
The issues dominating CISOs’ minds are securing AI agents (which 37% of respondents mentioned) and ensuring that employees’ use of AI tools conforms to security and privacy policies (36%). Beyond AI-powered phishing and malware development, the report shows, executives are also worried about the unintended security consequences of their own companies’ use of AI.
“Boards are pushing aggressively for enterprise-wide [AI] adoption, and security leaders are expected to enable, not block, this transition,” Team8 said in its report. “That puts CISOs in the hot seat: charged with mitigating risk in a technology domain that’s still poorly understood, moving fast, and lacking mature controls.”
Almost half of companies still require employees to get permission to use particular AI tools, an allow-listing approach that Team8 said could cause friction with non-security executives eager to expand their firms’ AI use. “The demand for effective ‘allow-by-default’ controls is acute,” Team8 observed, “as security teams grapple with shadow AI usage and the absence of enterprise-grade governance frameworks.”
On the flip side, CISOs are also eager to incorporate AI into their own operations. Nearly eight in 10 CISOs told Team8 that they expect security operations center roles to be the first positions replaced by AI. Nearly half of those CISOs said that reducing their employee count was a major factor in their experimentation with AI-powered SOCs. Executives also expect AI to replace humans in the areas of penetration testing (which 27% of CISOs cited), third-party risk assessments (27%), reviews of user access requests (24%) and threat modeling (22%).
In the areas of penetration testing and threat modeling — where there is a major workforce shortage because of the required skills and knowledge — Team8 said that AI agents could “unlock expert-level capabilities across a broader surface area.”
Already, nearly seven in 10 companies are using AI agents and another 23% are planning to deploy them next year, according to Team8’s report. Interestingly, given the proliferation of agentic AI vendors, more than two-thirds of the companies using or testing AI agents said they were developing them in-house.
EPRI’s Open Power AI Consortium plans to fuel grid modernization with data sharing
The power sector-tech sector group wants to open data access to improve reliability.
By: Herman K. Trabish • Published April 29, 2025
Power system modernization with artificial intelligence, or AI, is being slowed by a lack of access to system data for AI training, tech and power sector experts agree.
“Data is AI’s oil, gas, wind and solar all wrapped into one,” Alexandr Wang, founder and CEO of tech market leader Scale AI, told the House Committee on Energy and Commerce April 9. “Every major AI advancement is grounded in data shaped by human expertise,” and “to win on AI, we must turn our data asset into an advantage.”
Utilities want to win with AI, too, power system executives and analysts agree.
“The grid is one of the most complex machines on the planet,” and is “primed for digitalization,” said Southern California Edison Senior Vice President, System Planning and Engineering, Shinjini Menon. Data to fuel that digitalization is vital “because there is not enough time, money, or resources on planet earth” to continue using past practices, she added.
The power and tech sectors have partnered to address barriers to data accessibility in the Open Power AI Consortium led by the Electric Power Research Institute, or EPRI. And regulatory and technology innovations are opening data access. But important disagreements about cybersecurity and utility customer rights must still be resolved, experts told Utility Dive.
What data and why
Many stakeholders see new AI capabilities as a tool to address the power system’s enormous reach and complexity.
“This is not your grandfather's AI because it is not just machine learning, but it still needs data to learn,” said EPRI President and CEO Arshad Mansoor. And that data has “many different sources, including regional system operators, utilities, researchers, private sector device manufacturers, third party analysts, and cloud and software providers,” he added.
“An estimated 95% of all data is behind utility cybersecurity and customer privacy walls,” added EPRI Executive Vice President and Chief Financial, Risk, and Operations Officer Swati Daji. EPRI’s Open Power AI initiative will “create, develop and curate data sets as an open repository of solutions for the whole industry.”
It will also create “a sandbox for testing and validating use cases,” and “a feedback loop for sharing lessons learned,” Daji said. The open repository data can be integrated with utilities’ protected data for further AI modeling, and, as grids modernize, utilities will be able to better manage the infrequent peak demands that require so much utility spending, she added.
Another initiative is the “One Digital Grid Platform” developed by component manufacturer Schneider Electric in partnership with Microsoft and data provider Esri as a “grid data hub,” said Schneider Electric’s Digital Grid CEO Ruben Llanes. It will “ingest data from utility smart devices” and “enable real-time insights, predictive analytics and automation,” he added.
Both initiatives will advance utilities’ capabilities to use data from AI-enabled grid edge intelligence to “disaggregate customer usage and orchestrate flexible consumption,” Menon said. That will, however, require “distribution system investments for wires, software and communications technologies,” she added.
Planned together, “incremental investments in control room and system edge technologies can keep costs of upgrades affordable,” Menon said. An AI-enabled utility system “might do 10 million computations in the time it now takes to do 10,000 computations,” which will allow “more comprehensive planning and more granular, real-time system operations,” she added.
Even with the 15 million customer-owned smart devices expected on Southern California Edison’s system by 2045, AI tools can monitor system data “and call human operators’ attention to where key decisions are needed,” Menon said.
Policy work is accelerating to enable greater data access by all power system stakeholders.
Policy to drive data access
Data access has been the subject of recent federal, state and utility policy initiatives.
At the federal level, both the bipartisan AI framework proposed in 2023 by Sen. Richard Blumenthal, D-Conn., and Sen. Josh Hawley, R-Mo., and the PREPARED for AI Act introduced last year by Sen. Gary Peters, D-Mich., and Sen. Thom Tillis, R-N.C., include provisions for data protection and sharing. Neither has been enacted.
But in 2024, utilities and regulators continued to address access to customer data from advanced metering infrastructure, or AMI, at the state level, according to the most recent state policy action update from the North Carolina Clean Energy Technology Center.
According to the 2023 National Association of Regulatory Utility Commissioners Grid Data Sharing Playbook, all electric system data can be shared, but not personal, demographic, or program participation data that identifies customers. And that data access can enable system benefits, limit negative impacts and clarify the difference between “need to have” versus “nice to have” data, it added.
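A minimal sketch of that “share system data, withhold identifying data” rule is below. The field names and the 15-customer aggregation floor are illustrative assumptions, not requirements drawn from the playbook.

```python
# Illustrative filter: release grid/usage fields, strip customer-identifying fields,
# and only share usage that is aggregated across a large enough group.
IDENTIFYING_FIELDS = {"name", "address", "account_id", "demographics", "program_enrollment"}

def shareable_record(meter_record: dict, min_group_size: int = 15):
    """Return a copy safe to share, or None if the group is too small to anonymize."""
    if meter_record.get("group_size", 0) < min_group_size:
        return None
    return {k: v for k, v in meter_record.items() if k not in IDENTIFYING_FIELDS}

record = {"feeder": "F-12", "interval_kwh": 1.4, "group_size": 42,
          "name": "Jane Doe", "account_id": "123"}
print(shareable_record(record))   # {'feeder': 'F-12', 'interval_kwh': 1.4, 'group_size': 42}
```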
There are problems with access to both grid data and customer data, said Chris Villarreal, founder and president of consultant Plugged-In Strategies and a playbook co-author. Playbook practices can accelerate DER interconnection and enable economy-wide electrification by speeding data access and providing transparency and customer data protections where needed, he added.
Important data sharing policy work is ongoing in several states.
New York’s 2040 grid flexibility potential could avoid nearly $3 billion per year in power system costs, according to a new Brattle Group assessment for the state’s Grid of the Future proceeding. But implementing that flexibility potential “will require a massive amount of data, some of which is not measured or available,” said Brattle Managing Energy Associate and study co-author Akhilesh Ramakrishnan.
Utilities must invest to “measure and collect” data, and build “a single point of interaction and a single data format” to streamline access, Ramakrishnan added.
In Phase One of the California Public Utilities Commission’s R. 22-11-013 planning proceeding for DER growth, a Data Working Group was established to improve data access. It will “address rules and requirements on data access,” including standards for accuracy, customer privacy and utility compliance with regulatory requirements, the working group said.
Accessible real-time data would allow all available resources “to be used to balance supply and demand without spending hundreds of billions of dollars” for system upgrades, said Jon M. Williams, CEO of distributed battery provider Viridi.
In Massachusetts, lawmakers recently enacted Senate Bill 2967, which calls for “implementation of advanced metering data access protocols.” DER advocates endorsed the law, but utilities want a regulatory change.
Eversource Energy “fully supports” data access, and suggests “a common statewide portal using a virtual central data repository” rather than a “traditional database” that could “duplicate costs” and create “security vulnerability,” said Eversource Senior Vice President of Customer Operations, Digital Strategy, and Chief Customer Officer Jared Lawrence.
In Minnesota, regulators approved “open data access standards,” and approved $563.7 million for Xcel Energy to deploy smart meters and wireless communications to support them, according to Citizens Utility Board Executive Director Annie Levenson-Falk.
Utilities were also required “to collect and share aggregated or anonymized, disaggregated customer energy use data,” Levenson-Falk said. But Xcel’s denial of the first request for that data, as a potential threat to customer privacy or security, is now being challenged, she added.
A consistent focus of these proceedings is the use of AI-managed distribution system data to enable aggregated DER and virtual power plants, or VPPs, to be integrated into system operations.
Distribution system data is key
There is growing stakeholder agreement that to obtain, share and use distribution system data, investments in DER management systems, or DERMS, will be necessary.
Each of the two types of DERMS is “a collection of multiple software systems interacting with utility operational and information technology systems and business processes,” according to an April Smart Electric Power Alliance, or SEPA, white paper. Together, they can “enable a new paradigm for grid operations” that “no single software system can deliver,” the paper said.
Edge DERMS “manages large numbers of behind-the-meter DER of multiple types, like batteries, thermostats, and EVs, from multiple DER manufacturers,” SEPA said. Grid DERMS “is tightly interconnected with a utility’s distribution [operating] systems and focuses on managing larger front-of-meter DER, and aggregated customer-owned DER,” it added.
“More actively managing and integrating DER can improve grid reliability, power quality, and system resilience, and provide economic value to both utilities and customers,” SEPA said. And “uncoordinated DER can increase peak system demand, create distribution system overloads or voltage violations, and increase the risk of cascading blackouts,” it added.
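The sketch below is a rough rendering of that two-layer split: an edge layer that aggregates many small behind-the-meter devices, and a grid layer that dispatches front-of-meter assets plus those aggregations. Class names, methods and the dispatch order are illustrative assumptions, not SEPA's specification.

```python
# Simplified model of the Edge DERMS / Grid DERMS split described above.
class EdgeDERMS:
    def __init__(self, devices_kw):          # e.g. thermostats, EV chargers, home batteries
        self.devices_kw = devices_kw

    def available_flex_kw(self):
        return sum(self.devices_kw)           # simplistic: all devices fully curtailable

class GridDERMS:
    def __init__(self, front_of_meter_kw, edge_aggregations):
        self.front_of_meter_kw = front_of_meter_kw
        self.edge_aggregations = edge_aggregations

    def dispatch(self, needed_kw):
        """Use front-of-meter DER first, then call on edge aggregations."""
        plan, remaining = [], needed_kw
        for name, kw in self.front_of_meter_kw.items():
            take = min(kw, remaining)
            if take:
                plan.append((name, take))
                remaining -= take
        for i, agg in enumerate(self.edge_aggregations):
            take = min(agg.available_flex_kw(), remaining)
            if take:
                plan.append((f"edge_agg_{i}", take))
                remaining -= take
        return plan, remaining

grid = GridDERMS({"feeder_battery": 800}, [EdgeDERMS([5, 7, 3] * 100)])
print(grid.dispatch(1500))     # dispatch plan and any unmet need
```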
A Grid DERMS may cost roughly $10 million to $20 million, and an Edge DERMS tends to cost less than $10 million, depending on the utility system’s complexity, said SEPA Senior Director, Grid Strategy, and paper co-author Lakin Garth.
If DERMS deliver convincing system data, utilities will be able to use DER aggregations and VPPs for the same flexible system services other dispatchable resources offer, utilities, DER advocates and analysts agree.
Edge DERMS are already disaggregating behind-the-meter customer energy usage to solve the peak demand challenge and alleviate the need for distribution system upgrades, said Jesse Demmel, chief technology officer for DER aggregator and Edge DERMS provider Uplight.
“Data to provide more services is available, but it is siloed across utility planning, operations and customer service departments and difficult to assimilate into a single platform,” Demmel added.
With investments in DERMS, emerging AI modeling can assimilate a “single pane of glass” platform with visibility that connects customers to the utility control room, Demmel said. The investments will pay for themselves by using data to identify the most cost-effective upgrades, he added.
“DERMS investments are definitely needed, along with investments in hardware, operations and measurement technologies,” and “DER will benefit the system,” agreed Brattle’s Ramakrishnan. AI may not, however, completely resolve utility concerns about security or reliability because blaming an operator’s decision on an AI recommendation “will not satisfy regulators or customers,” he added.
Plugged-In Strategies’ Villarreal agreed. AI “might increase system modeling capabilities, but a single integration of all resources would be expensive and computationally complex,” he said.
AI operating at the distribution system level may, however, eliminate data sharing concerns because “it will follow the power flow to the data it needs to optimize system resources and show why that data should be available,” Villarreal added.
PJM, Google partner to speed grid interconnection using AI
The initiative aims to integrate dozens of PJM interconnection-related databases and tools into a unified model of PJM’s network, helping to bring power supplies online faster.
By: Ethan Howland • Published April 10, 2025
The PJM Interconnection will work with Google to use artificial intelligence to potentially revamp and speed up its grid interconnection process, the grid operator said.
Tapestry, part of Alphabet, Google’s parent company, will work with PJM in a multi-year effort to integrate dozens of databases and tools the grid operator uses to study interconnection requests into a unified model of PJM’s network, according to Page Crahan, Tapestry general manager.
The unified model will allow project developers, grid planners and operators to access everything they need to make critical decisions in one place, Crahan said during a media briefing.
Tapestry expects the project will roll out in phases, starting this year, according to Crahan. First, Tapestry and PJM aim to improve the generator interconnection application process, she said.
It’s too soon to know when and to what extent PJM’s two-year interconnection process will be accelerated by the initiative, said Aftab Khan, PJM’s executive vice president for operations, planning and security.
“There is significant potential for us, but it's very difficult at this time … to quantify and say, ‘Hey, this is what we're going to achieve,’” he said.
As a “guide post,” Tapestry worked with Chile’s grid operator to reduce the time it took to finish certain planning processes from several days to a few hours, according to Crahan. “So, we're looking for a significant order of magnitude improvement to support the PJM planners,” Crahan said.
PJM is in the middle of its transition to a new, cluster-based interconnection process, according to Khan. As part of the transition, PJM last year finished a 26-GW fast-track interconnection review, he said, noting that the grid operator has signed generator interconnection agreements totaling about 50 GW of capacity that is not yet operating. PJM expects to complete its 46-GW transition Cycle 1 late this year and its 99-GW Cycle 2 a year later, according to Khan. It expects to begin using its new interconnection process early next year.
Advanced Energy United, a trade group for clean energy developers, praised the PJM-Tapestry initiative. “Embracing innovation like this is a critical step toward clearing the massive interconnection backlog and getting more clean energy projects online faster,” Jon Gordon, AEU director, said in a statement Thursday. “We’re hopeful that pairing advanced technology with continued market and policy reforms will unlock the full potential of clean, reliable, and affordable energy for the PJM region.”
The collaboration between PJM and Tapestry comes amid a surge in grid interconnection requests that has created major backlogs at grid operators. About 2,300 GW of power projects, mainly solar and batteries, are in grid operator interconnection queues across the United States.
PJM’s interconnection queue as of Jan. 31 included about 79 GW of solar, 37 GW of storage, 23 GW of wind and 4 GW of gas.
Last month, David Rosner, a Federal Energy Regulatory Commission commissioner, sent letters to grid operators that highlighted how grid interconnection automation software can be used to speed the interconnection process. “Achieving a truly fast and efficient interconnection process requires continuous innovation that leverages the latest software and automation solutions,” Rosner said in the letter.
Rosner said it took two years for the Midcontinent Independent System Operator to conduct a manual study of a large interconnection cluster; an application developed by Pearl Street reproduced the study in 10 days and arrived at largely similar results.
During the PJM media briefing, in response to a question about coal-fired generation, Amanda Peterson Corio, Google head of data center energy, said the company was committed to clean energy.
“We remain committed to our goals to decarbonize our electricity footprint, 24 hours a day, seven days a week, and matched on an hourly basis, by 2030,” Corio said. “We are focused on developing and seeding technologies that can provide firm, reliable capacity around the clock that is also clean.”
Avista, PG&E, Ameren AI demonstrations show great potential – but are other utilities ready?
New artificial intelligence and machine learning algorithms can optimize complexities across the power system if utilities and regulators can make data more accessible — and protect it, experts say.
By: Herman K. Trabish • Published March 7, 2025
Utilities and system operators are discovering new ways for artificial intelligence and machine learning to help meet reliability threats in the face of growing loads, utilities and analysts say.
There has been an “explosion into public consciousness of generative AI models,” according to a 2024 Electric Power Research Institute, or EPRI, paper. The explosion has resulted in huge 2025 AI financial commitments like the $500 billion U.S. Stargate Project and the $206 billion European Union fund. And utilities are beginning to realize the new possibilities.
“Utility executives who were skeptical of AI even five years ago are now using cloud computing, drones, and AI in innovative projects,” said Electric Power Research Institute Executive Director, AI and Quantum, Jeremy Renshaw. “Utilities’ rapid adoption may make what is impossible today standard operating practice in a few years.”
Concerns remain that artificial intelligence and machine learning, or AI/ML, algorithms could bypass human decision-making and cause the reliability failures they are intended to avoid.
“But any company that has not taken its internal knowledge base into a generative AI model that can be queried as needed is not leveraging the data it has long paid to store,” said NVIDIA Senior Managing Director Marc Spieler. For now, humans will remain in the loop and AI/ML algorithms will allow better decision-making by making more, and more relevant, data available faster, he added.
In real world demonstrations, utilities and software providers are using AI/ML algorithms to improve tasks as varied as nuclear power plant design and electric vehicle, or EV, charging. But utilities and regulators must face the conundrum of making proprietary data more accessible for the new digital intelligence to increase reliability and reduce customer costs while also protecting it.
The old renewed
The power system has already put AI/ML algorithms to work in cybersecurity applications with cutting-edge learning capabilities to better recognize attackers.
Checkpoint Software, security provider to global AI chip maker NVIDIA, is working with standards certifier Underwriters Laboratories on new levels of security for consumer devices, said Peter Nicoletti, Checkpoint’s global chief information security officer. Smart devices “will be required to meet a security standard protecting against hackers during software updates,” he said.
Another proven power system application for advanced computing is market price forecasting based on weather, load and available generation.
Amperon has done weather, demand and market price forecasting with AI/ML algorithms since 2018, said Sean Kelly, its co-founder and CEO. But Amperon’s short-term modeling now “runs every hour and continuously retrains smarter and faster using less energy, combining the strengths from each iteration in a way that humans could never touch,” he added.
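For readers unfamiliar with the pattern, the sketch below shows an hourly rolling retrain-and-forecast loop in the spirit of what is described above. The simple lagged linear model and window sizes are assumptions for illustration, not Amperon's stack.

```python
# Illustrative hourly retrain-and-forecast loop: refit on the latest history,
# then project the next day; re-run the whole routine each hour as new data arrives.
import numpy as np

def retrain_and_forecast(history, horizon=24, lags=24):
    """Fit a lagged linear model on recent history, forecast the next `horizon` hours."""
    X = np.array([history[i:i + lags] for i in range(len(history) - lags)])
    y = np.array(history[lags:])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)       # "retrain" on the latest window
    window, preds = list(history[-lags:]), []
    for _ in range(horizon):
        nxt = float(np.dot(coef, window))
        preds.append(nxt)
        window = window[1:] + [nxt]                    # roll the forecast forward
    return preds

hourly_load = [400 + 50 * np.sin(h / 24 * 2 * np.pi) for h in range(24 * 14)]  # two weeks of data
print([round(p) for p in retrain_and_forecast(hourly_load)[:6]])  # next 6 hours
```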
Hitachi Energy’s Nostradamus AI forecasting tool, with the newest AI/ML capabilities, “has improved price forecasting accuracy 20% over human market price forecasting” since November, said Jason Durst, Hitachi Energy general manager, asset and work management, enterprise software solutions.
AI/ML-assisted technology has also emerged “as a critical pillar of wildfire mitigation strategy,” said Rob Brook, senior vice president and managing director, Americas, for predictive software provider Neara. It helps utilities identify wildfire risks “across their networks by proactively assessing more variables than a human can assimilate,” he added.
AI/ML algorithms have, in the last year, accelerated the use of robotics for solar construction, said Deise Yumi Asami, developer of the Maximo robot for power provider AES. The six months once needed to retrain Maximo have been eliminated because its AI/ML algorithms autonomously learn the unique characteristics of each solar project before it begins work, she added.
The new and more autonomous AI/ML capabilities will offer “increased stability, predictability, and reliability at scale,” said Nate Melby, vice president and chief information officer of Midwestern generation and transmission cooperative Dairyland Power Cooperative. Management of system complexity “is where AI could shine,” he added.
Utilities are increasingly using new AI/ML capabilities to meet the accelerating complexities of variable loads, proliferating distributed energy resources, or DER, and other power system challenges.
New needs, new capabilities
A power system without adequate flexibility “can lead to decreased reliability and safety, increased operational costs, and capacity costs,” Pacific Gas and Electric, or PG&E, concluded in its 2024 R&D Strategy Report. “AI/ML and other novel technologies can not only bolster our immediate response capabilities but also inform long-term planning and policymaking,” it added.
PG&E’s total electricity consumption will double in the next five to 10 years, but it can limit peak load growth to 10% with AI/ML-based grid optimization of DER on the existing infrastructure, PG&E CEO Patti Poppe said at the utility’s November Innovation Summit.
Access to AI/ML algorithms is now commercially viable, and their capabilities can optimize multiple large scenarios in parallel to support decision-making for the power system’s millions of variables, NVIDIA’s Spieler said. The algorithms can also write software code to allow utilities to use “the petabytes of stored system data they have but have not used to optimize more operations,” he added.
Utilities can upload and query their internal knowledge bases of research papers, rate cases and analyses of wildfire and safety issues into a generative AI model, Spieler said. The query responses can then explain system anomalies based on performance and maintenance histories or deliver needed data and precedents for writing general rate case and other regulatory proceeding filings, he added.
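As a very rough sketch of that "query your internal knowledge base" pattern, the snippet below retrieves the most relevant internal documents by simple keyword overlap and assembles them into a prompt for a generative model. The scoring method and the example documents are illustrative assumptions only.

```python
# Toy retrieval-and-prompt-assembly step for querying an internal knowledge base.
def top_documents(query, documents, k=2):
    q_terms = set(query.lower().split())
    scored = [(len(q_terms & set(doc.lower().split())), doc) for doc in documents]
    return [doc for score, doc in sorted(scored, reverse=True)[:k] if score > 0]

def build_prompt(query, documents):
    context = "\n".join(top_documents(query, documents))
    return (f"Using only the context below, answer the question.\n\n"
            f"Context:\n{context}\n\nQuestion: {query}")

docs = ["Feeder 12 voltage anomaly traced to a failed capacitor bank in the 2022 maintenance log.",
        "The 2021 general rate case filing cited vegetation management costs.",
        "The wildfire mitigation plan lists a covered conductor pilot on Feeder 12."]
print(build_prompt("Why is feeder 12 showing a voltage anomaly?", docs))
```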
Utility demonstrations are verifying the new AI/ML capabilities.
From DER to nuclear plants
Several demonstrations have focused on how AI/ML algorithms can optimize distribution system resources.
Utilidata’s Karman software platform and an NVIDIA GPU-empowered chip are embedded in Aclara smart meters and will soon be in other distribution system hardware, said Utilidata VP, Product, Yingchen Zhang. Karman reads high-resolution raw distribution system data 32,000 times per second and identifies individual customer electricity usage in real time, he added.
In a real-world demonstration, a University of Michigan-Utilidata study found that with Karman reading and reacting to granular real-time data, utilities can quickly stabilize voltage fluctuations induced by EV charging.
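A toy version of that kind of meter-level control loop is sketched below: watch voltage in near real time and throttle local EV charging when it sags. The setpoints, gain and readings are illustrative assumptions, not Karman's actual control logic.

```python
# Hypothetical voltage-responsive EV charging throttle.
V_NOMINAL, V_MIN = 240.0, 228.0      # nominal service voltage and a ~0.95 pu lower bound

def adjust_ev_charging(voltage_v: float, current_rate_kw: float, max_rate_kw: float) -> float:
    if voltage_v < V_MIN:                                   # sagging: shed charging load
        shortfall = (V_MIN - voltage_v) / V_NOMINAL
        return max(0.0, current_rate_kw * (1 - 5.0 * shortfall))
    if voltage_v > V_NOMINAL:                               # healthy: ramp back toward full rate
        return min(max_rate_kw, current_rate_kw + 0.5)
    return current_rate_kw

rate = 11.0
for v in [239.5, 231.0, 226.0, 224.5, 229.0, 238.0]:        # simulated meter readings
    rate = adjust_ev_charging(v, rate, max_rate_kw=11.0)
    print(f"V={v:.1f} -> charge at {rate:.1f} kW")
```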
Within one year of implementing software from data disaggregation specialist Bidgely, Avista Utilities reduced service calls in response to high bill complaints by 27%, reported Avista Corp. Products and Services Manager Andrew Barrington. Instead of a service call to check the customer’s meter, Bidgely’s software analysis identified the customer usage causing the bill spike, he added.
A Bidgely disaggregation analysis evaluated EV charging for 10,000 Ameren Missouri customers, reported Caroline Cochran, its VP, Delivery, in a Stanford-EPRI conference presentation. The analysis identified 73 customers whose charging could be better managed to avoid or defer costly infrastructure expenditures that otherwise would have been needed to serve EV charging loads, she added.
Bidgely’s similar 2023 disaggregation analysis of 100,000 NV Energy EV charger owners identified “hot spots where infrastructure investment will likely be needed first,” which limited larger distribution system capital investment, reported the Smart Electric Power Alliance’s January AI for Transportation Electrification Insight Brief.
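In the spirit of those analyses, the sketch below shows a highly simplified form of interval-data disaggregation: flag likely EV charging sessions as sustained step jumps in a home's 15-minute load. The 3-kW step and one-hour duration thresholds are illustrative assumptions, not Bidgely's method.

```python
# Toy EV charging detection from smart meter interval data.
def detect_ev_sessions(kw_intervals, step_kw=3.0, min_intervals=4):
    sessions, start = [], None
    baseline = min(kw_intervals)
    for i, kw in enumerate(kw_intervals):
        elevated = kw - baseline >= step_kw
        if elevated and start is None:
            start = i
        elif not elevated and start is not None:
            if i - start >= min_intervals:           # long enough to look like EV charging
                sessions.append((start, i))
            start = None
    return sessions

home = [0.6, 0.7, 0.6, 4.2, 4.3, 4.1, 4.2, 4.3, 0.8, 0.7, 2.1, 0.6]   # 15-minute kW readings
print(detect_ev_sessions(home))    # [(3, 8)] -> one likely charging session
```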
AI/ML algorithms are also finding efficiencies that reduce nuclear power plant costs and safety challenges.
PG&E is using Atomic Canyon’s Generative AI software, trained to Nuclear Regulatory Commission standards, at its Diablo Canyon Nuclear Power Plant, said Nuclear Innovation Alliance Research Director Patrick White. And innovative AI/ML-based plant designs, operations and predictive preventive maintenance are limiting costs and increasing plant safety, he added.
There are, however, things utilities must do to more fully take advantage of the accelerating AI/ML capabilities, utilities and providers recognize.
The work ahead for utilities
Effectively capturing the benefits of AI/ML algorithms begins with recognizing the potential and acquiring and using the right hardware and software, utilities and third parties say.
Avista’s successful adoption of third-party AI/ML “began with a mindset,” said Barrington. The key questions were “how to enhance customer engagement, how to integrate customer data with system operations, and how to enhance system visibility and enable proactive strategies,” he added.
AI/ML algorithms are now extracting real-time data and making actionable suggestions, Utilidata’s Zhang said. But “utilities cannot take advantage of the suggestions because they do not have the technology and communications ecosystems in place,” he added.
Utilities need communications technologies, advanced metering and edge computing infrastructure, and data processing and storage technologies, EPRI’s Renshaw said. And, at the distribution system level, utilities should also have software that can be securely updated for new technologies as customers adopt them, Utilidata’s Zhang added.
Balancing the protection of security and customer privacy with the need to provide data to train AI/ML algorithms continues to be a significant challenge.
Protecting utility data requires “strong cybersecurity practices,” said Dairyland Power’s Melby. But utilities need to access and manage data in a way “AI platforms can leverage,” he added.
Recently, “utilities have begun doing penetration testing to prove their data is as secure in our system as in theirs,” said Bidgely’s Cochran. They also “have developed AI committees to do extra thorough reviews of the users of their data,” she added.
“There is good reason for utilities to be conservative about data privacy, but AI/ML power system applications are not yet any threat,” Utilidata’s Zhang said. Federated learning or foundation models are ways to both protect privacy and provide data for algorithm training, he added.
Federated learning allows utilities to protect proprietary data by building synthetic models of their data about specific challenges that can be shared at a secure location for further training, Zhang said.
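A bare-bones sketch of federated averaging, one common flavor of the federated learning Zhang describes, appears below: each utility trains on its own protected data and shares only model weights, which a coordinator averages. The model, data and hyperparameters are purely illustrative.

```python
# Illustrative federated averaging across three utilities' private datasets.
import numpy as np

def local_update(weights, X, y, lr=0.01, epochs=20):
    """One utility's local linear-model training; raw data never leaves the utility."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, datasets):
    local_ws = [local_update(global_w, X, y) for X, y in datasets]
    return np.mean(local_ws, axis=0)         # coordinator sees only weights, not data

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
datasets = []
for _ in range(3):                            # three utilities with private datasets
    X = rng.normal(size=(200, 2))
    datasets.append((X, X @ true_w + rng.normal(scale=0.1, size=200)))

w = np.zeros(2)
for _ in range(30):
    w = federated_round(w, datasets)
print(np.round(w, 2))                         # approaches [2.0, -1.0] without pooling raw data
```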
But some think federated learning may be too limited for power system complexities. Foundation models would use orders of magnitude more data that has been anonymized and pre-trained with as much power system information as possible, EPRI’s Renshaw and others said.
Utilities may be able to create a foundation model to enable shared learning and protect their data, said PG&E Senior Director of Grid Research, Innovation and Development Quinn Nakayama.
“The bottom line is — gather more high-quality data, use, store and protect it properly, and feed it into models that are trained and updated for the right tasks,” Renshaw concluded.