A few weeks ago, I had the honor and privilege of being one of a few invited attendees at the DOE Mission Innovation Workshop on Grid Modernization. The workshop was hosted by the University of Pittsburgh and held at the Energy Innovation Center. Attendees included leaders from the Department of Energy, Pittsburgh city government officials, community and foundation organizations, and representatives from key local industries — including major utilities, electrical system integrators, electrical system manufacturers and technology companies (like ANSYS).
Pittsburgh and other similar cities face significant energy and sustainability challenges over the next few years. These challenges stem primarily from the gap between the ambitious goals that have been set (as can be seen in the SmartPGH video) and the current state of the grid and industrial equipment.
These challenges exist both in terms of:
- replacing and updating aging infrastructure and technology (for example, installing smart meters capable of net metering and developing microgrids for higher reliability and security), and
- building up a workforce that can develop and implement this new technology.
Regular readers of this blog, and other simulation aficionados, will of course be very familiar with the role of simulation in designing more efficient and smarter products. For example, ANSYS Simplorer could be used to study the benefits and effects of replacing older silicon-based switching devices with more efficient wide-bandgap devices.
In addition, I believe these challenges highlight another key role for simulation software: building a digital twin. The digital twin is the end result of a successful simulation-driven product development cycle, also known as a complete virtual prototype. This digital twin can then be studied and used as a proxy for the real system, both to train the workforce and to identify and fix operational issues.
As an illustration, here is a quick exercise that I tried. Let's say that a utility company in Pennsylvania is considering installing a utility-grade solar setup and wants to determine how effective it would be, based on historical solar data, which is abundantly available from government agencies.
To start this experiment, we quickly built up a large solar panel array in Simplorer using existing components.
We then chained several of these together to build a large source. Using the dataset tool, we imported the hourly radiation data for the most recent available year. Finally, we simulated various configurations, as well as panels from different manufacturers, to determine the expected output power for the system. The utility company could use this simulation to study various configurations and manufacturers for its installation.
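To make the idea concrete outside of Simplorer, here is a minimal Python sketch of the same calculation: scaling hourly irradiance data up to expected array output and integrating it into annual energy. All panel parameters and the irradiance samples below are assumed example values, not figures from the actual study.

```python
# Illustrative sketch (not Simplorer): estimating array output from
# hourly irradiance data. All parameters are assumed example values.

PANEL_AREA_M2 = 1.6          # assumed area of one panel
PANEL_EFFICIENCY = 0.18      # assumed module efficiency
PANELS_IN_ARRAY = 1000       # assumed number of panels in the array
INVERTER_EFFICIENCY = 0.96   # assumed DC-to-AC conversion efficiency

def expected_power_kw(irradiance_w_m2: float) -> float:
    """Expected AC power (kW) for a given plane-of-array irradiance."""
    dc_watts = (irradiance_w_m2 * PANEL_AREA_M2
                * PANEL_EFFICIENCY * PANELS_IN_ARRAY)
    return dc_watts * INVERTER_EFFICIENCY / 1000.0

# A few hourly irradiance samples (W/m^2) stand in for a full year
# of historical data imported through the dataset tool.
hourly_irradiance = [0, 150, 420, 780, 950, 820, 510, 180, 0]

# Each hourly power value (kW) held for one hour gives kWh; summing
# over the dataset yields the expected energy for the period.
energy_kwh = sum(expected_power_kw(g) for g in hourly_irradiance)
```

Swapping in different panel areas, efficiencies, or array sizes is the scripted analogue of comparing configurations and manufacturers in the simulation.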
Once this is done, the final Simplorer design could then be used as an online diagnostic tool. Using a radiation sensor (such as a pyrheliometer), the local solar radiation could be measured and fed into the simulation. The feeding-in is implemented by writing a custom CModel in Simplorer that listens on a port where the sensor sends its data.
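The CModel itself is C code compiled into Simplorer, but its role can be sketched in a few lines of Python: accept a connection from the sensor, parse each newline-terminated reading, and hand it to the running model. The function names and the newline-delimited wire format here are assumptions for illustration, not the actual CModel interface.

```python
# Sketch of the sensor listener's role (hypothetical names and wire
# format; the real implementation is a custom CModel in Simplorer).
import socket

def parse_reading(line: bytes) -> float:
    """Parse one newline-terminated irradiance reading (W/m^2)."""
    return float(line.strip())

def listen_for_readings(host: str, port: int, handle, max_readings: int) -> None:
    """Accept one sensor connection and feed each reading to `handle`,
    the way the CModel feeds measurements into the simulation."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn, conn.makefile("rb") as stream:
            for i, line in enumerate(stream):
                if i >= max_readings:
                    break
                handle(parse_reading(line))
```

Here `handle` would be whatever updates the simulation's irradiance input each time a new measurement arrives.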
The simulation could then be used to compare observed power (as measured on the installation with a wattmeter) with expected power (based on simulation). A significant deviation would trigger a detailed evaluation of the system in Simplorer. When augmented with actual sensor data, we could also predict the location of the problem. In the simulation below, the significantly lower power output can easily be diagnosed as a failed electrical connection that disconnects a large section of panels.
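The trigger logic behind that comparison is simple enough to sketch directly. The 10% threshold below is an assumed tolerance chosen for illustration; a real deployment would tune it to the measurement noise of the site.

```python
# Sketch of the observed-vs-expected check described above.
# DEVIATION_THRESHOLD is an assumed tolerance, not a value from the post.

DEVIATION_THRESHOLD = 0.10  # flag shortfalls larger than 10%

def needs_inspection(observed_kw: float, expected_kw: float) -> bool:
    """True when measured output falls too far below the simulated value."""
    if expected_kw <= 0:
        return False  # nothing expected (e.g. at night): no check possible
    shortfall = (expected_kw - observed_kw) / expected_kw
    return shortfall > DEVIATION_THRESHOLD

# A healthy hour vs. an hour where a string of panels has dropped out.
needs_inspection(270.0, 276.5)   # small mismatch: within tolerance
needs_inspection(190.0, 276.5)   # large shortfall: trigger diagnosis
```

When the check fires, the detailed Simplorer evaluation takes over to localize the fault, as in the failed-connection case above.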
Clearly, simulation, combined with data-sensing techniques, could provide enormous benefits in both designing and operating the products required to meet our energy challenges.