
Controlling a Hardware-in-the-Loop Grid Simulator for the World’s Most Powerful Renewable Energy Test Facility

"This facility is unique because it has the ability to simulate and test the entire cyber-physical system, including the electromechanical machine of the wind turbine, the dynamics of the power grid, and the cyber control system software algorithms and how they interact."

- J. Curtiss Fox, Duke Energy eGRID Director, Clemson University Restoration Institute

The Challenge:

Providing energy companies and graduate students with a state-of-the-art facility to test both the mechanical and electrical characteristics of hardware prototypes for any energy resource on a utility scale (up to 15 MW) in a controlled and calibrated environment before deploying them on the actual grid.

The Solution:

Delivering high-speed deterministic DAQ, control, and communications for a 15 MVA hardware-in-the-loop (HIL) grid simulator using NI LabVIEW system design software and NI PXI, NI CompactRIO, and NI FlexRIO hardware.

Author(s):

J. Curtiss Fox - Duke Energy eGRID Director, Clemson University Restoration Institute
Mark McKinney - Clemson University Restoration Institute
Ben Gislason - Clemson University Restoration Institute

 

In 2009, the Department of Energy (DOE) awarded the Clemson University Restoration Institute (CURI) the largest DOE grant ever issued to build the world’s most powerful mechanical test facility for wind turbine nacelles up to 15 MW, three times the capacity of the largest nacelles in use today. These nacelles house the core generating components of a turbine and bear tremendous stress from the rotating blades; therefore, this equipment must be rigorously tested prior to deployment to minimize repair costs. The DOE is counting on wind power to provide 20 percent of US energy by 2030, and this facility supports that objective.

 

As energy companies and graduate researchers work to bring technology innovations to market quickly, they also need access to state-of-the-art electrical testing capabilities. Therefore, the DOE issued a separate grant to provide these capabilities through a 15 MW grid simulator in what is now the Duke Energy Electrical Grid Research Innovation and Development (eGRID) center. With the addition of this 15 MW eGRID, companies can test both the mechanical and electrical characteristics of hardware prototypes for any energy resource on a utility scale in a controlled and calibrated environment before deploying them on the actual grid.

 

The eGRID offers advanced electrical testing of multi-megawatt devices. To achieve this, the grid simulator required the following:

  • Sophisticated communication systems capable of bridging multiple protocols across multiple time domains
  • High-speed deterministic DAQ capable of providing timestamped high-resolution data for offline analysis and real-time data to ensure grid simulator control system stability
  • Easily reconfigurable and expandable systems for multiple testing scenarios
  • A complete human-machine interface (HMI) for easy control of the entire system

 

The Grid Simulator

The 15 MVA grid simulator comprises a wide assortment of electrical components. We used NI hardware and FPGAs combined with LabVIEW system design software to integrate these components into a system capable of true HIL control. The deterministic nature of the FPGAs and the LabVIEW Real-Time Module allows for a flexible and reliable system for DAQ, communication, and control.

 

NI software and hardware met the requirements of the eGRID interface controller by offering a flexible and modular solution that handles multiple communication protocols and data types, high-speed DAQ, and synchronization within an industrial/laboratory environment. A simplified diagram of the grid simulator system is shown in Figure 2. The system has four main components: power amplifiers, a reactive divider network, a real-time simulator, and a control interface.

 

 

 

Power Amplifier

The power amplifier is essentially a three-phase 15 MVA arbitrary waveform generator capable of producing waveforms ranging from 45 Hz to 65 Hz with specified voltage amplitudes and prescribed relative phase angles and harmonic content. This amplifier acts as an actual power grid to which a component would be connected, but it has far greater control over the grid characteristics and behavior.
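
The amplifier's role can be illustrated with a short numerical sketch (in Python, since LabVIEW code is graphical): a balanced three-phase set with prescribed phase offsets and optional harmonic content. The function and its parameters are illustrative only, not part of the actual controller.

```python
import math

def three_phase_sample(t, freq_hz=60.0, amplitude=1.0,
                       phase_offsets=(0.0, -120.0, -240.0),
                       harmonics=None):
    """Return one voltage sample per phase at time t (seconds).

    harmonics: optional dict {order: relative_amplitude}, e.g. {5: 0.03}
    injects a 3% fifth harmonic, mimicking prescribed harmonic content.
    """
    harmonics = harmonics or {}
    samples = []
    for offset_deg in phase_offsets:
        theta = 2 * math.pi * freq_hz * t + math.radians(offset_deg)
        v = amplitude * math.sin(theta)
        for order, rel in harmonics.items():
            v += amplitude * rel * math.sin(order * theta)
        samples.append(v)
    return samples
```

For a balanced set the three phase voltages sum to zero at every instant, which is a handy sanity check on any generated waveform table.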

 

Reactive Divider

The reactive divider helps engineers create fault scenarios. It consists of a set of variable inductors and resistors placed both between the amplifier and the device being tested and as a shunt to ground at the point where the device under test (DUT) is connected to the system.
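
The divider's effect on the voltage at the DUT bus follows from the impedance-divider ratio. The sketch below (Python, illustrative only; the function name and per-unit values are assumptions) computes the retained voltage for given series and shunt impedances.

```python
def fault_voltage_sag(z_series_ohm, z_shunt_ohm, v_source=1.0):
    """Voltage retained at the DUT bus during a fault scenario.

    The series impedance sits between the amplifier and the DUT; the
    shunt impedance goes to ground at the DUT connection point. The
    retained voltage is the divider ratio Z_shunt / (Z_series + Z_shunt).
    Impedances may be complex (R + jX) to model the variable inductors
    and resistors.
    """
    return v_source * z_shunt_ohm / (z_series_ohm + z_shunt_ohm)
```

For example, equal series and shunt reactances leave 50 percent of the source voltage at the DUT bus, i.e. a 50 percent sag.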

 

Real-Time Digital Simulator

Finally, to create a true HIL system, a real-time digital simulator (RTDS) is used to simulate realistic responses of a moderately sized power grid. Using the RTDS with the power amplifier, the system is capable of not just generating the predefined voltages of a power system but also responding realistically to the effects of having the DUT actually connected to the system.

 

Control Interface

At the heart of all of these components is the control interface, which controls every aspect of the grid simulator. Its functions fall into three main categories: data acquisition and logging, communication, and control. Many options were considered for implementing this vital component of the grid simulator, but the flexibility and modularity of NI hardware and software made NI the obvious choice. The deterministic nature of real-time and FPGA hardware coupled with the intuitive LabVIEW programming environment drastically reduced the development time of the entire system. The control interface is made up of NI PXI chassis populated with NI R Series multifunction reconfigurable I/O (RIO) FPGA modules, GPS-synced timing cards, NI FlexRIO FPGA modules, and an NI CompactRIO expansion chassis.

 

System Control

The grid simulator system is composed of low- and high-speed DAQ and control. A connectivity diagram for the control interface is shown in Figure 3. The main operator interface is the control room HMI PC. This PC provides test setup and monitoring as well as control of the low-speed systems. Temperature monitoring and control are achieved from the HMI PC with a CompactRIO expansion chassis communicating over Ethernet. The reactive divider network is configured with LabVIEW through Schweitzer Engineering Laboratories relay modules using DNP3 protocols. 

 

 

The NI PXI interface controller is responsible for nearly all of the required high-speed communications, including communication with the power amplifier, RTDS, and silicon-controlled rectifier switches that control the timing of induced faults.

 

The power amplifier, designed and built by TECO Westinghouse Motor Company, is controlled with a custom serial communication protocol over fiber. The controller transmits and receives a packet of voltage information for all three phases on a 12 kHz sync signal from the amplifier. Being able to update voltage levels every 83.3 µs with a nominal AC frequency of 60 Hz allows for exceptional control over the precision and harmonic content of the generated waveforms.
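
The timing works out as follows (a quick sanity-check calculation in Python; the constants are the figures quoted above):

```python
# Packet timing for the amplifier link (values from the text above).
SYNC_RATE_HZ = 12_000        # one voltage packet per sync pulse
NOMINAL_FREQ_HZ = 60.0       # nominal AC frequency

update_period_us = 1e6 / SYNC_RATE_HZ               # ~83.3 µs between updates
updates_per_cycle = SYNC_RATE_HZ / NOMINAL_FREQ_HZ  # 200 setpoints per AC cycle

print(f"{update_period_us:.1f} us per update, {updates_per_cycle:.0f} per cycle")
```

Two hundred setpoints per 60 Hz cycle is what makes the fine control over harmonic content possible.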

 

Instead of just producing a predefined voltage waveform, the power amplifier can create waveforms consistent with actual events found in real power grids. Essentially, the DUT cannot tell the difference between being connected to this virtual grid and an actual power grid.

 

The desired waveforms are either generated by LabVIEW code running on the interface controller or from the RTDS. The RTDS is specifically designed to simulate power grids in real time. Communication with the RTDS is accomplished through a custom fiber communication link using an NI FlexRIO FPGA module with an adapter module.

 

For an added layer of sophistication, the RTDS can take actual voltage and current readings from a DUT, thus serving as a true HIL system. These measurements are routed from the NI PXI DAQ device through the interface controller and to the RTDS. The RTDS then integrates these voltages and currents into the simulated grid and responds exactly as a real-world grid would.
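
Conceptually, each HIL iteration feeds the measured DUT current into the simulated grid and receives a responsive bus voltage back. A minimal illustration in Python, assuming the simulated grid is reduced to a simple Thevenin equivalent (the actual RTDS model is far more detailed):

```python
def rtds_step(dut_current, thevenin_emf, thevenin_z):
    """One illustrative HIL iteration.

    The simulated grid responds to the measured DUT current through a
    Thevenin equivalent (V = E - Z * I); the resulting bus voltage is
    what would be sent back to the power amplifier as its next setpoint.
    Quantities may be complex phasors.
    """
    return thevenin_emf - thevenin_z * dut_current
```

With zero injected current the DUT sees the open-circuit grid voltage; as the DUT draws current, the bus voltage drops across the grid impedance, exactly the responsive behavior a predefined waveform cannot provide.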

 

Data Acquisition

To ensure the stability of the HIL system, low-latency DAQ independent of channel count was needed. The voltage and current measurements from various points throughout the system were initially supposed to be acquired with a traditional DAQ device and propagated to the PXI chassis using reflective memory. However, this solution could not provide the low latency required to transport the voltage and current values through the control system to control the power amplifier without risking instability in the system. We determined the latency was caused by the accumulation of several small time delays throughout the system, and only small performance gains were possible until we changed the system architecture.

 

We implemented an alternative approach using FPGA DAQ and a custom fiber communication protocol based on the protocol used by the power amplifier. The direct FPGA-to-FPGA communication system used a modified RS232 protocol operating at up to 40 Mbit/s across plastic optical fiber. Using the NI PXI-7842R R Series multifunction RIO module allowed for eight channels of DAQ per card at up to 200 kS/s per channel. The sampled data was also transferred to the NI PXI DAQ device using a DMA first in, first out (FIFO) process for recording on the data logger at a rate that can be adjusted independently of the sampling rate of the fiber-optic communication interface.  
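
A back-of-the-envelope throughput check shows why 40 Mbit/s suffices for one card's worth of channels. The sample width and framing overhead below are assumptions for the estimate, not published link parameters:

```python
# One NI PXI-7842R card: 8 analog channels at up to 200 kS/s each.
CHANNELS = 8
SAMPLE_RATE_SPS = 200_000
BITS_PER_SAMPLE = 16         # assumed sample width for this estimate
FRAMED_BITS_PER_BYTE = 10    # RS232-style framing: start + 8 data + stop
LINK_RATE_BPS = 40_000_000   # plastic optical fiber link capacity

payload_bytes_per_s = CHANNELS * SAMPLE_RATE_SPS * BITS_PER_SAMPLE // 8
line_rate_bps = payload_bytes_per_s * FRAMED_BITS_PER_BYTE

assert line_rate_bps <= LINK_RATE_BPS  # 32 Mbit/s fits within 40 Mbit/s
```

Under these assumptions the framed stream needs 32 Mbit/s, leaving headroom on the 40 Mbit/s link.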

 

Additionally, all three of the NI PXI-7842R modules were synchronized via PXI triggers with the NI PXI-6682H timing module, which recorded the timestamp for all 24 analog input channels. The NI PXI controller received the low-latency data transmitted by the FPGAs in the three NI PXI-7842R modules on an NI FlexRIO PXI Express FPGA module, which had 16 fiber-optic channels. Similarly, control information from the NI FlexRIO PXI Express module was transmitted to the NI PXI interface controller via DMA FIFOs and was timestamped via PXI triggers from an NI PXI-6682H timing module in the same chassis.

 

The NI PXI-6682H timing modules in both the NI PXI DAQ and NI PXI interface controller modules were synchronized via IEEE 1588 to a third NI PXI-6682H timing module in the NI PXI data logger that hosted the IEEE 1588 grandmaster clock, which was synchronized to GPS time. The data captured from both the NI PXI DAQ and NI PXI interface controller modules was streamed via Ethernet using network streams to the data logger for recording. The data logger then streamed the data to a 24 TB RAID array through an MXI-Express module in the Technical Data Management Streaming format. This stored data was then accessible from the control room PC using network streams.

 

 

 

This facility is unique because it has the ability to simulate and test the entire cyber-physical system, including the electromechanical machine of the wind turbine, the dynamics of the power grid, and the cyber control system software algorithms and how they interact. The biggest advantage of this solution is that a relatively small development group with minimal FPGA or HDL experience, but a moderate amount of LabVIEW programming experience, was able to quickly develop a powerful FPGA solution using the LabVIEW FPGA Module.

 

Author Information:

J. Curtiss Fox
Duke Energy eGRID Director, Clemson University Restoration Institute

Figure 2. Simplified Diagram of the eGRID Grid Simulator
Figure 3. Connectivity Diagram for the Control Interface
Figure 4. Duke eGRID Simulator