In the 1990s, utilities and their regulators in a number of states across the country settled on retail rate net metering as a simple and convenient way to encourage the growth of distributed solar.
It seemed like a good idea at the time — consumers would earn credits equal to the retail rate of electricity for any extra energy they exported back to the grid from their rooftop solar systems. For years, the practice allowed customers with rooftop solar to significantly cut their utility bills, enhanced the value proposition of rooftop solar, and allowed renewable energy to grow without a significant impact on utility finances.
But in the past few years, the situation has changed. Since the Great Recession, most utilities across the nation have faced stagnant or declining load growth as consumers have cut energy usage and invested in efficiency upgrades. At the same time, more utility customers are turning to rooftop solar, and their numbers have become large enough in some service areas to significantly impact utilities' bottom lines.
Now, solar advocates and policymakers across the nation are passionately debating whether the “rough justice” of a retail rate net metering credit should still apply to today’s evolving distributed energy resources (DERs) market, and what a more accurate valuation scheme for DERs would be. A new report says it could take several years of advanced calculation to get the answer.
The rise of VOS
A key question in the debates is the “proper value” of rooftop solar, according to North Carolina Clean Energy Technology Center Policy Analyst Benjamin Inskeep, co-author of a quarterly catalogue of U.S. solar policy decisions.
“If a state isn’t proposing a value of solar study or something similar to account for the costs and benefits, it is probably studying it,” Inskeep said. “If not, it is a matter of when, not if.”
Value of solar (VOS) studies date to at least 2006, according to Pace Energy and Climate Center Executive Director Karl Rabago who, as an Austin Energy executive, led one of the earliest VOS implementations.
Refined calculations now make VOS a “forward-looking approach that recognizes the ad-hoc use of the retail rate for NEM was rough justice and [further] analysis is appropriate,” Rabago explained.
Utility regulators largely appreciate the VOS approach because it allows regulators "to escape being caught between utilities and solar advocates,” he said.
Even so, solar valuation questions remain hotly debated across the nation. Last month, when California regulators upheld retail rate net metering, they did so while adding the caveat that “a better understanding of the impact of customer-sited distributed resources on the electric system will be developed” by 2019, when commissioners will revisit the net metering issue.
The ruling also orders the commission staff to begin working on new valuation methodologies and further analysis on alternative compensation structures, including a rate “that takes into account locational and time-differentiated values of customer-sited DG.”
The commission "is not comfortable with the variety of value and cost estimates for this resource,” Commissioner Carla Peterman said at the close of the NEM 2.0 proceeding. “We want to get to the right number, we are troubled by the wide range of values, and we are saying in this decision that we need to get to a number by 2019.”
Location, location, location
To get to the number California regulators want and allow solar energy to stand alone in the marketplace, “location-based value calculations are essential,” according to a recently released report from GTM Research.
Early efforts in New York’s Reforming the Energy Vision proceeding and California’s Distribution Resource Planning (DRP) proceeding show that determining location-based value is dauntingly complex, the study reported. But new efforts in Washington, Arizona, Massachusetts, and other states show the search is expanding for accurate cost-benefit numbers to resolve the animus of value-of-solar proceedings.
“The development of an overarching, long-term algorithm for the value of distributed resources will be a foundational tool for utilities,” according to Director of Grid Research Ben Kellison, the GTM Research study's author.
An algorithm that allows utilities and third-party providers to fully monetize DERs “will accelerate the deployment of DERs, while finally directing these resources to the optimal locations to provide value to asset owners as well as the network,” the study concluded.
The algorithm will also support utilities’ DER procurements by allowing them to select the right combination of “low-cost, high-value grid upgrades that empower and leverage customer and utility investments to increase system stability, reliability, power quality, grid efficiency, and reduce carbon intensity,” Kellison said.
One of the keys to determining locational value is “fine-grained and dynamic” maps of utilities' distribution grids “to differentiate with much greater precision between costs at different sub-nodal locations,” Rabago said.
The utilities “are only beginning to understand that a solar panel on my roof may be less valuable than a solar panel the next street over,” he explained. “When they understand that, they will be able to make more precise valuations of distributed resources to them.”
A vivid example is Consolidated Edison’s Brooklyn Queens Demand Management (BQDM) Program, Rabago said. The utility decided a $250 million investment in distributed resources like energy efficiency, storage, and demand response capabilities could defer a $1 billion investment in new distribution system infrastructure.
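The economics behind a non-wires alternative like BQDM come down to the present-value savings of pushing a capital project into the future. A minimal sketch of that arithmetic, using the $1 billion upgrade and $250 million DER figures from the example above, with an assumed 7% discount rate and five-year deferral (both illustrative, not from the program):

```python
# Sketch: why deferring a capital project with DERs can pencil out.
# The $1B upgrade and $250M DER figures come from the BQDM example;
# the 7% discount rate and 5-year deferral are illustrative assumptions.

def deferral_value(capex, years_deferred, discount_rate):
    """Present-value savings from pushing a capital project into the future."""
    return capex - capex / (1 + discount_rate) ** years_deferred

savings = deferral_value(capex=1_000_000_000, years_deferred=5, discount_rate=0.07)
der_cost = 250_000_000
net_benefit = savings - der_cost

print(f"PV savings from deferral: ${savings:,.0f}")
print(f"Net benefit after DER spend: ${net_benefit:,.0f}")
```

Under these assumptions the deferral is worth roughly $287 million in present-value terms, so the $250 million DER portfolio comes out ahead even before counting the other services those resources can provide.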
New software & algorithm requirements
The goal for an algorithm design would be, Kellison said, “a single process in which utilities could compare traditional grid investments with DERs and understand the value, the benefits and costs, and make planning decisions, and present those decisions to regulators for their evaluations."
The algorithm would also potentially provide the same information to third parties to guide investments to neighborhoods and substations where development is needed and away from locations where it would create expenses for the utility, he added.
Some of the questions the algorithm would answer include a resource’s value for energy, capacity and ancillary services, the transmission and distribution system needs for it, and the engineering to connect it to the network, Kellison said.
To address utilities' objections to VOS proceedings, the algorithm would also have to be based on agreed-upon quantifications, he added. Debates over the transmission and distribution (T&D) system losses component of VOS calculations are one example.
“T&D losses across a utility’s system are a certain percent overall, but those losses are not the same in every part of the territory,” he said. “VOS calculations, as they are done today, don’t provide the incremental value of specific locations.”
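The losses point can be made concrete with a small sketch. A kWh generated at the customer's meter avoids the losses that delivering that kWh would have incurred, so its value is grossed up by the local loss factor; crediting everyone at the system average understates value on high-loss feeders and overstates it on low-loss ones. All numbers here are illustrative assumptions, not from any actual VOS study:

```python
# Sketch of the T&D-losses point: the same kWh of rooftop solar is worth
# more on a high-loss feeder than a low-loss one. Numbers are illustrative.

AVOIDED_ENERGY_COST = 0.06  # $/kWh at the wholesale level (assumed)

def loss_adjusted_value(energy_cost, loss_factor):
    """Value of 1 kWh generated at the meter.

    On-site generation avoids the losses of delivering that kWh,
    so its value is grossed up by 1 / (1 - losses)."""
    return energy_cost / (1 - loss_factor)

system_avg   = loss_adjusted_value(AVOIDED_ENERGY_COST, 0.07)  # 7% average losses
rural_feeder = loss_adjusted_value(AVOIDED_ENERGY_COST, 0.12)  # long, high-loss feeder
urban_feeder = loss_adjusted_value(AVOIDED_ENERGY_COST, 0.03)  # short, low-loss feeder

for label, v in [("system average", system_avg),
                 ("high-loss feeder", rural_feeder),
                 ("low-loss feeder", urban_feeder)]:
    print(f"{label:>18}: {v:.4f} $/kWh")
```

The spread between the feeder-specific values and the system average is exactly the incremental locational value that, per Kellison, today's VOS calculations miss.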
As DER penetrations increase, those differences will become increasingly important, Kellison said. To get to those specifics, four areas need software development: geographic information systems (GIS), medium- to long-term load forecasting, distribution system planning, and a customer data hub.
“Before you get into algorithms around power flows you need to understand things like what type of transformers are out there, how big the cabling is, where the transformers are, how many there are, and which customers are attached to which secondary transformers,” Kellison said. “Getting into that more granular information is necessary to have an ability to understand where the resource needs to be. That is what the GIS software will provide.”
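A minimal sketch of the grid-inventory data Kellison describes the GIS layer holding: which transformers exist, their ratings, and which customers hang off each one. All IDs, ratings, and loads below are hypothetical:

```python
# Minimal sketch of GIS grid-inventory data: transformers, their ratings,
# and customer attachments. All IDs, ratings, and loads are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Transformer:
    xfmr_id: str
    feeder_id: str
    rating_kva: float
    customers: list = field(default_factory=list)  # attached customer IDs

    def connected_load_kw(self, peak_loads):
        """Sum the peak loads of attached customers (peak_loads: id -> kW)."""
        return sum(peak_loads[c] for c in self.customers)

t1 = Transformer("XF-001", "FDR-12", rating_kva=50.0,
                 customers=["C-100", "C-101", "C-102"])
peak_loads = {"C-100": 12.0, "C-101": 18.5, "C-102": 9.0}

load = t1.connected_load_kw(peak_loads)
headroom = t1.rating_kva - load  # rough headroom, ignoring power factor
print(f"{t1.xfmr_id}: {load} kW connected, ~{headroom} kVA headroom")
```

Even this toy model shows why the connectivity data matters: the headroom on a secondary transformer is one of the first screens for where a new DER helps the network and where it creates an upgrade cost.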
It will also give DER developers and utilities an indication of which customers to approach with new product offerings, he added.
A new focus for utilities
“Most utilities have not looked at load forecasting from a locational perspective but only from a peak demand perspective,” Kellison said.
Instead, they have focused on overall growth, expected GDP changes, demographic shifts, and changes in business activity. With the new focus on locational value, software must incorporate things like what economic factors will drive more solar, more EVs, more storage, and where they will be on the system.
For distribution system planning software, utilities will need to characterize their resources and their load profiles down to the feeders, the segments of feeder, and the customers.
“They will have to work it all the way up and down and understand how that will change peak demand to make optimal distribution planning decisions,” Kellison said. “Rate design can be built off this.”
Some software for these purposes has been built by the Sacramento Municipal Utility District (SMUD) and AEP, often in partnership with market leaders like Cyme or Synergy, Kellison said.
But for the location-based valuation now needed, that software has not sufficiently incorporated GIS, is too simplified to use with two-way power flows, or has not connected the distribution system and transmission system, he said. “They are taking on pieces of a complex overall upgrade that is not easy to do and so far they have only gotten parts.”
The California IOUs, driven by their regulators, are closest to a full working software program, and they have all worked with Cyme and DNV GL’s Synergy, Kellison said.
The idea of a customer data hub is the most nebulous piece because it involves political and regulatory challenges in addition to software design complexities.
Utilities would house information in the hub that would indicate the best locations for DERs and, through it, verified third parties could access that information, Kellison explained. But, because it would necessarily include so much information about customers, there is a question of the role of regulated utilities and regulators in building it.
“What information is in the hub? How do the parties interact? How granular is the data?” Kellison asked. “It is by far the most nebulous piece and the most outside utility control.”
The most advanced models
California’s DRP is leading the “paradigm shift in the distribution planning and procurement process,” the study reported.
“Distribution planning should start with a comprehensive, scenario-driven, multi-stakeholder planning process that standardizes data and methodologies to address locational benefits and costs of distributed resources,” reads the foremost principle informing the California commission’s Order Instituting Rulemaking for its DRP proceeding.
This is “a key component in the design of future utility business models to reduce utility costs and improve service,” the GTM Research study reports.
“From a distribution planning perspective, it most definitely allowed for thinking with tools we haven’t normally used,” Pacific Gas and Electric (PG&E) Principal Engineer Mark Esguerra told Utility Dive. “Instead of looking strictly at wires solutions, we started looking at DERs as non-wires solutions and at what the most cost-effective way of using them is to meet our reliability, safety, and affordability standards.”
PG&E did dynamic analyses for “all its distribution feeders down to the line section level,” its filing reported, in addition to analyzing "more than 500,000 nodes and 102,000 line sections across 3,000+ feeders to provide the locational capacity for multiple DERs.”
PG&E "learned a lot about [its] distribution system,” Esguerra said.
Southern California Edison (SCE) and San Diego Gas and Electric opted to use the CPUC-approved Distributed Energy Resource Avoided Cost (DERAC) Calculator developed by Energy+Environmental Economics for their initial DRP filings. It generalizes data from representative feeders and was enhanced to include a comprehensive list of location-specific benefits and avoided costs. Both utilities are now working on more granular data.
“A representative feeder has characteristics similar to other distribution circuits in factors like resistance, voltage, load, and whether it is overhead or underground,” said SCE Director of Electrical System Planning Erik Takayesu. “It is a way to model an entire system without modeling every single circuit.”
Going forward, SCE will have an online tool that will, using the results of the study of the representative feeders, extrapolate the capabilities of the other 4,600-plus circuits in SCE's service territory, Takayesu explained. “In the next DRP, using automated analytical processes, we intend to perform this analysis across all our distribution circuits," he said.
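The representative-feeder approach Takayesu describes amounts to characterizing each circuit by a few features and reusing the detailed study results of the nearest representative. A hedged sketch of that matching step, with hypothetical feature vectors (normalized load, normalized length, fraction underground) standing in for the real circuit characteristics:

```python
# Sketch of representative-feeder matching: score each circuit against a
# few representative profiles and reuse the nearest match's detailed
# study results. Feature vectors and names are illustrative.

# (normalized_load, normalized_length, fraction_underground)
representatives = {
    "urban-short-ug": (0.9, 0.2, 0.95),
    "suburban-mixed": (0.6, 0.5, 0.40),
    "rural-long-oh":  (0.3, 0.9, 0.05),
}

def nearest_representative(circuit, reps):
    """Return the representative whose feature vector is closest (squared L2)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reps, key=lambda name: dist2(circuit, reps[name]))

# A hypothetical circuit: heavily loaded, short, mostly underground.
match = nearest_representative((0.85, 0.25, 0.9), representatives)
print(match)  # "urban-short-ug"
```

In practice the utilities study each representative feeder in detail once, then run a cheap matching pass like this over thousands of circuits, which is what makes extrapolating to SCE's 4,600-plus circuits tractable before full circuit-by-circuit analysis arrives.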
An evolving algorithm
“The eventual approval of an algorithm will act as a key input to propose new projects,” the GTM Research study said, while also “accent[ing] technical interconnection analysis limitations to guide investment to the optimal locations in the network.”
The algorithm is at least two to five years away, Kellison said. “It will take a lot of effort, and even getting the values agreed on in test bed areas is some time off.”
With DERs still at very low penetrations overall, there is little urgency in most places, he added, though already places like Brooklyn present opportunities to avoid or defer large capital expenditures.
The final California DRP filings are due in 2017, and the pilots for the concepts won’t be in place until 2020. Even so, an algorithm that comes out of that process could proliferate.
“Much of the effort around developing those core systems will be applicable to other states,” Kellison said. “It depends on how particular regulators in particular states choose to go on utility planning and procurement.”
The impact on the private sector of more location-based valuations is not yet clear to Kellison. “Resources with similar output profiles will have a declining value no matter what,” he said. “Adding too much solar into one particular area will lower the rate of return.”
But location-based valuation will also open up private sector opportunities by revealing areas where DERs should replace traditional upgrades.
“Maybe instead of a larger cable and two bigger transformers, a utility can contract with a third party to put in energy storage at half the cost and get additional services,” he said.