I recall one of my first assignments at the power company: evaluating a new power system simulation software package. Its vendor touted it as the end of the data management nightmare faced by power engineers.
The problem was this:
Utilities tended to keep separate copies of the network data for each simulation case they ran. Say a power system planning engineer wished to simulate the power flow on the transmission system for the projects planned five years into the future. The widespread practice was to copy the data from the last simulation run, probably from the year before, edit it, and run the study again. But one simulation wasn’t enough. The engineers needed to perform many “what if” analyses; that is, they needed to simulate the loss of one or more major pieces of equipment. So they copied the base case data and made the appropriate changes in a separate network data set for each contingency. The result was that the utility had to maintain many copies of the data. The work was compounded when the engineers found out that the network they were planning for might change. Maybe a project would be cancelled or delayed. They then had to go back to all the copies and make the adjustments.
This was a major data management problem. Think of all the places where users could make mistakes. These simulations were critical to the stability and reliability of the electric network. Mistakes could be costly and even deadly.
That is only part of the problem. Another set of engineers, in charge of protecting the system against short circuits, had to perform a different simulation on the same network. The data they needed was somewhat different from the data needed for power flow, even though the short circuit analysis was performed on the exact same physical network. Like the power flow engineers, the protection engineers also had to run “what if” analyses, so they kept a slightly different set of data for their simulation. More copies, more chances for errors.
It didn’t stop there. Engineers and planners had to perform other simulations: stability analysis, insulation coordination, state estimation and others. Each required a somewhat different form of the network data. More copies, even more chances for errors.
You would think this problem would be easy to fix. Just create a single database of the network information and extract the right data for each of the simulations. That’s what the software I was evaluating was intended to do.
It failed to catch on. Why? Back then, data management was still primitive; the data was all file based. Worse, users were comfortable with the data in its current form and saw no real value in learning to deal with a more complicated system.
The problem still exists today. The issue is one of abstraction. Applications such as power flow require a simplified model of the network. Even though the underlying physical network is the same, each simulation requires a slightly different abstraction. The result is that once an abstracted data set is formed, users edit the abstracted version of the data instead of the original source. For many utilities, the original source of network data is the GIS. The more simulations, the more copies of slightly different abstractions of the same data.
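The “single source, many abstractions” idea can be sketched in a few lines of code. This is a toy illustration, not any real utility data model: the classes, attribute names and impedance values are invented. The point is that each study extracts only the view it needs from one authoritative copy, so an edit is made once and never reconciled across copies.

```python
from dataclasses import dataclass

@dataclass
class Line:
    """One transmission line in the canonical network model (names invented)."""
    name: str
    impedance: complex            # series impedance, used by the power flow study
    zero_seq_impedance: complex   # zero-sequence data, used by the short circuit study
    in_service: bool = True

# The single authoritative copy of the network.
network = [
    Line("L1", 0.01 + 0.1j, 0.03 + 0.3j),
    Line("L2", 0.02 + 0.2j, 0.05 + 0.5j),
]

def power_flow_view(net):
    """Extract only the data a power flow study needs."""
    return {l.name: l.impedance for l in net if l.in_service}

def short_circuit_view(net):
    """Extract only the (different) data a short circuit study needs."""
    return {l.name: (l.impedance, l.zero_seq_impedance)
            for l in net if l.in_service}

# An edit is made once, on the source...
network[1].in_service = False

# ...and every derived view reflects it, with no copies to keep in sync.
print(power_flow_view(network))     # only L1 remains in the power flow view
print(short_circuit_view(network))  # and in the short circuit view
```

The contrast with the old practice is that the derived views here are throwaway outputs, never edited directly.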
The ArcGIS Platform to the Rescue.
The latest release of the ArcGIS platform, 10.6, comes with a new network management extension. It helps utilities keep their network data at a level much closer to physical reality. ArcGIS manages network assets that closely resemble the real physical assets themselves. Those assets have location, they are three dimensional, and they vary with time. We have all heard the term digital twin: a digital replica of physical things, like electric assets. At ArcGIS 10.6, the GIS truly is a digital twin of the network.
A simulation requires access only to the data it needs. Editing is performed in the GIS, not in the simulation and not on the abstracted data. Since the platform is based entirely on web services, the data delivery mechanisms are simple and operate in real time. Results can be delivered back to the GIS for visualization and further analysis.
Simulation software is provided by Esri’s vast network of partners.
This solves only half of the problem. The other half is the multiple representations of the network over time, including the “what if” representations. At ArcGIS 10.6, users can specify which network elements are to be included in (or excluded from) visualization and analysis. Users make their edits on a single set of data, and the system is smart enough to know which data to include for a given simulation.
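The include/exclude idea behind “what if” cases can also be sketched simply. This is not the ArcGIS API; the element names and scenario structure are invented. What it shows is that each contingency is just a filter over one shared dataset, so a change to the base data, such as a cancelled project, flows to every case automatically.

```python
# One shared set of network elements (names invented for illustration).
base_network = {"Line-A", "Line-B", "Line-C", "Transformer-1"}

# Each "what if" case names the elements to exclude; the data is never copied.
scenarios = {
    "base": set(),
    "lose_Line-B": {"Line-B"},
    "lose_Transformer-1": {"Transformer-1"},
}

def network_for(scenario):
    """Return the elements a simulation should see under one scenario."""
    return base_network - scenarios[scenario]

# A single edit to the base data (say, project Line-C is cancelled)
# is immediately reflected in every contingency case.
base_network.discard("Line-C")

for name in scenarios:
    print(name, sorted(network_for(name)))
```

Under the old copy-per-contingency practice, the cancelled project would have meant re-editing every copy by hand.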
Data management happens in the GIS, not in each application. State management (tracking the various projects) happens at this level as well. This puts the GIS at the center of data management, state management and visualization. The software package I evaluated back when I first worked at the utility had the right idea, but of course it wasn’t a GIS. Now there needs to be only one place to edit and distribute your network information.
For more information on the ArcGIS Utility Network Management Extension, click here.