Feature

How California's utilities are mapping their grids for distributed resources

Utilities and DER companies are working together to understand where the grid can handle more customer-sited resources

Delivering electricity to everyone, all the time, is no small job — and the rise of distributed energy resources is making it exponentially more challenging.

California’s electricity system was built to deliver power from a few hundred central station power plants. But the rapid growth of rooftop solar in the state means that thousands of new generators are being added each week, putting strains on distribution systems originally designed for one-way power flows.

If this growth in distributed energy resources (DERs) is to continue, utilities must better understand where customer-sited resources can be added to their grids without causing reliability problems.

In the state’s integration capacity analysis (ICA) proceeding, engineers from its three big investor-owned utilities (IOUs) are addressing that question, working with DER providers to develop a methodology to identify where DERs can connect to the distribution system and how much each feeder can handle.

Those findings will be made public and inform a parallel proceeding on how to monetize the value of the DERs.

“The utilities are being highly collaborative and the hard work they are doing at the edge of this field is inventing new ways of doing this,” said Sky Stanfield, senior counsel at the Interstate Renewable Energy Council (IREC), a participant in the proceeding.

In responding to “detailed, technical questions” from DER vendors, utility engineers have learned which features of the locational analysis, known as hosting capacity analysis (HCA), are “most critical,” said Erik Takayesu, director of grid modernization at Southern California Edison (SCE).

In this “collaborative atmosphere,” probing by DER providers in the ICA working group has resulted in the development of analytic tools “that will be highly useful to DER interconnection and integration,” added Amber Albrecht, spokesperson for San Diego Gas & Electric.

Finding the right method for hosting capacity analysis is the primary hurdle that stands in the way of two basic objectives defined in the ICA working group’s just-released draft final report. The final version is due March 15.

The first objective is to improve the DER interconnection process by identifying exactly how much DER can be added at any interconnection point on the distribution system. The second is to bring DER into utilities’ distribution system annual planning by identifying the best sites for future DER development.

These objectives will be delivered by the IOUs as publicly accessible online system maps, embedded with detailed, downloadable system data.

Obtaining this locational view of the IOUs’ systems is one of the two key steps necessary to bring DERs into the power sector marketplace. The other will come from work being done in parallel by the Locational Net Benefit Analysis (LNBA) working group.

Together, the conclusions from the two working groups will present a clearer picture of where distributed resources can be added to the grid and what they are worth at each location. That will allow regulators, utilities, and providers to see how and where DERs can cost-effectively serve the distribution system, and how they fit into long-term planning.

By identifying where and when DERs can deliver benefits to the broader power system, policymakers will be able to better guide California’s investments in grid modernization. But before regulators authorize spending ratepayer money on tomorrow’s grid, the utilities need to know where to invest. That requires a methodology for mapping the system.

HCA methodology: ‘Streamlined’ vs. ‘iterative’

California’s IOUs are further along than other U.S. distribution system operators in mapping the granular distribution grid data, IREC’s Stanfield said. But DER providers still do not know where they can interconnect without complicated, costly, time-consuming utility reviews, she added.

SDG&E’s Albrecht offered some perspective on interconnection costs. Though circumstances vary, the SDG&E Fast Track study requires at least an $800 deposit and a 15-day review. But it can require another $2,500 and extend another 20 days. A full detailed study requires a deposit of between $10,000 and $250,000 and takes 60 days or longer.

If the ICA establishes an accurate methodology, interconnections can more often be fast tracked, said Brad Heavner, policy director at the California Solar Energy Industries Association (CalSEIA). “We need to get to a point where the analysis is feasible from the utility perspective but usable from the customer perspective.”

An early version of the Electric Power Research Institute (EPRI) Distribution Resource Integration and Value Estimation (DRIVE) methodology is partly the basis for one methodology being studied in California. A later version was tested by Xcel Energy, and the most recent version has been designated by New York’s utilities as their hosting capacity analysis (HCA) methodology.

“Everyone agrees the methodology needs to be as accurate as possible but also needs to be usable by developers,” said EPRI Vice President Mark McGranaghan. “It needs to be as location-specific as possible and needs to allow as much DER as possible. A methodology that gives answers that are too conservative or limits DER is not acceptable.”

California’s utilities were directed, in an Assigned Commissioner Ruling (ACR) from CPUC President Michael Picker, to develop demonstration projects that would test two very different methodologies.

A “streamlined” HCA establishes a level of distribution system granularity at which the utilities model and extract power system data, according to the working group draft final report. The utilities then apply power system criteria to those data to determine node-by-node DER capacity for display in an online map.

This method is abstract. It applies a set of equations and algorithms to a baseline power flow simulation of the circuit being analyzed. Those equations and algorithms assess a limited set of electrical characteristics, rather than the more complete set of thermal, voltage, protection, and safety criteria used to determine the maximum hosting capacity at any specific system point.

The streamlined methodology “may not capture some of the more dynamic effects on the more complex circuits,” the working group report noted. But, “the ability to utilize simpler equations and algorithms within a database can enable faster computations on large datasets.”
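As a rough illustration of the streamlined idea (the numbers and the single voltage-rise criterion here are hypothetical, not drawn from any utility's actual analysis), a simplified screen might apply a closed-form equation to baseline data at each node:

```python
# Toy "streamlined" screen: estimate each node's hosting capacity from
# baseline power-flow data using one closed-form voltage-rise limit.
# All values are illustrative, not from any utility's actual analysis.

def streamlined_hosting_kw(v_baseline_pu, r_ohms, v_nominal_kv, v_max_pu=1.05):
    """Largest DER injection (kW) keeping voltage under v_max_pu, using
    the common approximation dV (pu) ~ P * R / V_nominal^2."""
    headroom_pu = max(v_max_pu - v_baseline_pu, 0.0)
    v_nominal_v = v_nominal_kv * 1000.0
    # Solve headroom = P * R / V^2 for P, then convert W -> kW
    return headroom_pu * v_nominal_v**2 / r_ohms / 1000.0

# Hypothetical feeder nodes: (baseline voltage in pu, resistance to source in ohms)
nodes = {"N1": (1.02, 0.8), "N2": (1.03, 1.5), "N3": (1.045, 2.2)}
for name, (v_pu, r) in nodes.items():
    print(name, round(streamlined_hosting_kw(v_pu, r, v_nominal_kv=12.0), 1), "kW")
```

Because each node reduces to evaluating a formula against stored baseline values, this style of calculation can run over an entire database of feeders quickly, which is the speed advantage the report describes.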

Contrasting the “streamlined” HCA, an “iterative” methodology “performs an iterative power flow simulation at each node on the distribution system,” the draft report explained. Its detailed system simulations derive a full analysis but require multiple iterations. Though the outcome offers a much higher level of accuracy, very long processing times are needed for systems with large numbers of distribution circuits.

“The use of an iterative simulation parallels what IOUs would perform as part of an interconnection study,” the draft report notes. But it would likely “provide more confidence in representation of integration capacity on more complex circuit conditions.”
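By contrast, the iterative approach can be sketched as a loop that steps up the DER injection at a node and re-checks system limits after each step. The power-flow function below is a toy stand-in with made-up linear sensitivities; real studies re-run full network simulations at each step:

```python
# Toy "iterative" hosting-capacity search: step up DER at one node and
# re-run a power-flow check until a limit is violated. The check here is
# a stand-in; actual studies use full power-flow simulations per step.

def power_flow_ok(node_injection_kw, thermal_limit_kw=2500.0,
                  v_base_pu=1.02, v_max_pu=1.05, sensitivity_pu_per_kw=1e-5):
    """Stand-in for a full power-flow run: checks one thermal and one
    voltage criterion using illustrative linear sensitivities."""
    v_pu = v_base_pu + node_injection_kw * sensitivity_pu_per_kw
    return node_injection_kw <= thermal_limit_kw and v_pu <= v_max_pu

def iterative_hosting_kw(step_kw=100.0, max_kw=20000.0):
    """Increase injection step by step; the last passing level is the
    node's hosting capacity under these toy criteria."""
    capacity = 0.0
    kw = step_kw
    while kw <= max_kw and power_flow_ok(kw):
        capacity = kw
        kw += step_kw
    return capacity

print(iterative_hosting_kw(), "kW")  # thermal limit binds first in this toy setup
```

Repeating this loop for every node, hour, and circuit is what drives the long processing times the draft report flags for large systems.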

While the working group paper outlines the two models, EPRI’s McGranaghan argued that the two methodologies should not be labeled as only “streamlined” or “iterative.”

“There are not simply an iterative and a streamlined methodology,” he said. “There is a spectrum of methodologies with different levels of accuracy that have different applicability depending on the goal.”

This will be especially true when applied to planning processes, McGranaghan said. “Forecasts are very difficult because there are so many unknowns. When they get the model of the physical distribution system right, one system at a time, they still need to know what the load will be and what new DER will be adopted and when.”

In a demonstration at Southern California Edison, the average time needed to complete the 576 hourly calculations to determine the HCA for a single average-size feeder using the streamlined methodology was approximately 2 minutes. The iterative methodology, by contrast, took approximately 23 minutes for the single feeder, but “produces the most accurate results that could be applied more seamlessly in the interconnection process,” according to the utility’s report.

The EPRI DRIVE method selected by the New York utilities takes a different approach and requires 3 minutes to 5 minutes for an average feeder, McGranaghan said.

“We have to figure out a methodology that is scalable, replicable, and compatible with utility systems,” he said. “There will be methodologies on the spectrum that will work but it won’t be as simple as just California’s streamlined and iterative methodologies.”

The utilities’ initial roll-out of the online DER maps and data is likely to be in the next year to 18 months and will probably be updated monthly, CalSEIA’s Heavner said. “The closer the data updates are to real-time, the more useful it will be for developers.”

“The ultimate but ambitious goal is a real-time hosting capacity analysis, but there are more things to figure out,” IREC's Stanfield said.

Complete results of the demonstrations, as reported by the utilities, were incorporated into the working group’s draft final report recommendations.

Preliminary utility findings

Picker's ruling requires demonstrations to be performed in at least two distinct Distribution Planning Areas (DPAs) and evaluated with each of the two methodologies, SCE reported.

SCE’s two demonstrations totaled “eight distribution substations and 82 distribution feeders serving a representative mixture of residential, commercial, industrial, and agricultural customers.”

Its top-level conclusion: “The characteristics of local distribution systems are significant factors which dictate the level of DERs that can be interconnected to the distribution grid without adversely affecting the critical distribution system components.”

Hosting capacity and the impact on the interconnection process will vary with physical and load characteristics, SCE’s Takayesu said. The HCA-derived data will “inform developers of areas where interconnection projects are more likely to pass Fast Track and where more detailed study may be needed.”

As required by Picker’s ruling, SCE evaluated both HCA methodologies. It used one day per month of typical light-load conditions for 12 months. The 576 hours of data produced the 2 minute per feeder average for the streamlined method and the 23 minute average for the iterative method.

The streamlined method achieves results more quickly than the iterative method but “the level of accuracy is highly dependent on the complexity of the distribution system, and, in some cases, yields sub-optimal ICA results that would require further study during interconnection,” SCE acknowledged.

SCE was looking for a balance between “accuracy of results and computational time requirements” that could be used in the near term and refined “for long-term applications,” it added.

Its proposed solution for the first online mapping and data roll-out is “a blended ICA method” that would apply the iterative method to a “typical 24-hour, light-load day” and the streamlined method to a 576-hour analysis. This would provide the “balance of computational accuracy and time” as well as “a solid baseline for the development of a more complex, long-term ICA analysis.”

SDG&E

SDG&E’s study of its demonstrations found the hosting capacities derived using the two methods “were within 30% of each other 75% of the time.” The report attributed the differences to technical limits in the streamlined method.
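The kind of cross-method statistic SDG&E cited can be computed as a simple tolerance check across matched results; the capacity values below are invented for illustration:

```python
# Sketch of the comparison SDG&E reported: the share of points where the
# streamlined result falls within 30% of the iterative result.
# The capacity values below are made up for illustration.

def within_tolerance_share(streamlined, iterative, tol=0.30):
    """Fraction of paired results whose relative difference is within tol."""
    hits = sum(1 for s, i in zip(streamlined, iterative)
               if i > 0 and abs(s - i) / i <= tol)
    return hits / len(streamlined)

streamlined_kw = [1200, 900, 2500, 400]   # hypothetical per-node capacities
iterative_kw   = [1000, 1100, 2400, 800]
print(f"{within_tolerance_share(streamlined_kw, iterative_kw):.0%} within 30%")
# prints: 75% within 30%
```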

The important difference between the two methods was that the streamlined method “could be up to 50 times faster depending on how the iterative analysis is applied,” SDG&E reported. “For this reason, performing iterative analysis updates on a frequent basis is prohibitive.” Its recommendation is an annual update for interconnection analysis.

SDG&E also included a comparison of each utility’s analysis of the hosting capacity on a common test circuit. “There was not a significant difference, in either integration capacity value or computer time,” it reported.

The demonstrations were “a good launching pad” for the tools and methods needed to do HCA because they revealed “challenges to be overcome,” SDG&E reported.

“There is general consensus on using an iterative simulation-based approach for the hosting capacity analysis, due to its more accurate results that can be used in interconnection review,” SDG&E’s Albrecht said. But it will require “vendor collaboration to improve and streamline the process.”

Only the iterative method is accurate enough for interconnection purposes, despite its long “run times” and the “daunting” size of the data files it produced, the utility reported. But, because the streamlined method is more workable for the “what if” scenario analysis used by planners, a blended methodology is likely to be the “optimal approach,” SDG&E concluded.

PG&E

The Pacific Gas and Electric (PG&E) report on its demonstrations also supported using both methodologies “as appropriate” because each “is better suited for specific applications.”

Results from the utility’s demonstration project “determined that more research and development is needed to produce robust automated algorithms to accurately determine hosting capacity through iterative approaches,” said Mark Esguerra, director of integrated grid planning. Of the three IOUs, PG&E raised the most questions about how the ICA is impacting its IT system. The utility called for greater “cooperation with software vendors” to “optimize and enhance” data management to meet HCA needs.

A blended methodology can be “on the spectrum of methodologies at a place where it has the right combination of computational efficiency and effectiveness,” EPRI’s Smith said.

Toward HCA consensus

For calculating the HCA needed to make the processing of interconnection applications more timely, “a majority of the working group recommended the iterative methodology,” according to the draft final report.

In fact, “the non-utility parties solidly agree and the utilities mostly agree, though not finally,” CalSEIA’s Heavner said.

“We believe the ultimate methodology will provide the highest level of accuracy while balancing complexity,” SCE’s Takayesu said. “We are optimistic that we will be able to arrive at consensus on key issues.”

The IOUs are concerned that the amount of granular and complex data will overburden their resources. The non-utility participants want a methodology that provides adequate data for fast-tracking interconnections at available locations and avoiding burdensome impact studies.

The non-utility parties understand the utilities’ arguments about what the iterative method imposes, Heavner said. “But we are going through all the work so we want the methodology to be right. The judgment of whether it is worth the cost depends on how granular the data is that comes out of it.”

An emerging strategy is looking for computational efficiencies that can collapse the burden on utility resources, he added. “Also, there may be offsetting benefits from using the iterative method. If it significantly reduces the time for detailed interconnection analyses, that might offset the cost of the hosting capacity analysis.”

The working group draft report accepted in theory the utilities’ consensus recommendation to use the streamlined methodology for planning. But the many ongoing CPUC proceedings on system planning, utilities’ annual planning processes, and grid modernization will inform how the ICA and its companion locational analyses are applied, it noted.

Until those proceedings advance, the methodology to be used for planning should remain “a priority long-term refinement item,” the report concluded.

There were other important recommendations from the working report. The mapping and online data should be “updated frequently enough to allow for a meaningfully faster (if not ultimately ‘automatic’) interconnection process.”

While SDG&E found an annual update to be adequate, the non-utility stakeholders argued updates should be at least monthly. SCE accepted this. PG&E warned of the costs for that degree of processing.

To make the online maps and data usable, the values must match those used for interconnection reviews, the working group report added. The values should “provide monthly and hourly data about hosting capacity limitations that enables a developer to design a system that takes full advantage of the available hosting capacity at their proposed point of development,” it concluded.

After the ICA is finalized, PG&E will focus on planning tools to automate analyses and on developing internal processes to enable regularly updated ICA values, Esguerra said.

“The ICA is a complex and resource intensive effort and, ideally, we would want to automate this process as much as possible,” SCE’s Takayesu acknowledged.

Ultimately, he added, “more resources will be required even as we add more automation and computing power to support the iterative nature of this analysis.”

The distributed grid of the future

The next question is how the availability of the data will change the handling of DER, IREC's Stanfield said. “It remains to be seen how they will use the data in their distribution planning.”

It will take the utilities “a long time” to have the data and models needed for “perfect” representations of their systems, EPRI’s Smith said. “Operational flexibility is the next big issue.”

CalSEIA’s Heavner agreed. “It is the next big point of contention. Utilities do not want DER to limit in any way their ability to make operational decisions in emergency situations,” he said. “There is a less conservative but still transparent and efficient standard, but it is not yet clear what it is.”

A standard for operational flexibility that allows more DER but satisfies utilities will require “a more sophisticated system that gives utilities real-time visibility of DER,” he added. “We need smart inverters and other technologies to do that.”

SCE recognizes that smart inverters may “alleviate some DER-driven technical issues that create hosting capacity limitations,” Takayesu agreed. But “more work is needed to integrate smart inverter technology.”

Heavner also sees a debate coming about how to include the outcomes of the DER proceedings into the commission’s complex Rule 21 technical standards.

SDG&E’s Albrecht agreed. “While the parties are generally in agreement on using ICA within Rule 21, there are nuances between their positions on how exactly the ICA results are applied.”

EPRI’s McGranaghan is looking even farther into the future. The right methodology could move the discussion from simply an analysis to understanding how to actually increase a system’s hosting capacity at the lowest cost, he said.

“That will involve the full range of new technologies and be part of the decisions on grid modernization investments,” McGranaghan said.

“Ultimately, the goal is dynamic protections in an intelligent operational system that can deal with two-way power flows,” he added. “But that means new controls, sensors, and new distribution system operations.”
