Today's incredibly complex products require an ever-growing amount of testing, and traditional test instruments are not scaling to keep up. Each generation of technology packs in more components and more sophisticated functionality, and the cost of testing these devices with traditional instrumentation only increases.
One way to minimize hardware costs and reduce test time has long been to use virtual (software) instruments and modular I/O. A newer approach, software-designed instrumentation, gives engineers the ability to achieve test time reductions orders of magnitude beyond what was previously possible without custom chip designs and extensive engineering investment.
Read on to learn how customizing your test instrumentation through the user-programmable FPGA in software-designed instrumentation can enhance your test systems.
For years, test engineers have used software such as LabVIEW—instead of the fixed software in traditional boxed instruments—to customize measurement systems and reduce cost. This approach provides flexibility and takes advantage of the latest PC and CPU technologies, but the CPU is still a bottleneck in many demanding test applications.
CPUs inherently limit parallelism, and typical software stacks result in latencies that reduce test system performance in cases where measurements need to be adjusted dynamically based on values or device under test (DUT) state. To dramatically reduce test times, you need to combine custom digital logic with multicore CPU technology to give your test system a balance of low latency and high throughput.
While off-the-shelf instrumentation hardware traditionally has fixed capability, NI is leading the way in more open, flexible measurement devices based on FPGA technology. In short, FPGAs are high-density digital chips that you can customize to incorporate custom signal processing and control algorithms directly into measurement hardware. The result is off-the-shelf hardware that combines fixed, high-quality measurement technology and the latest digital bus integration with user-customizable logic that is highly parallel, provides low latency, and is tied directly to I/O for inline processing and tight control loops.
An example of this type of device is NI's vector signal transceiver (VST), which combines the functionality of a vector signal generator (VSG) with that of a vector signal analyzer (VSA) and also contains a user-programmable FPGA for real-time signal processing and control. With the added flexibility of the FPGA, the vector signal transceiver is ideal for channelization, channel emulation, DUT control, power amplifier characterization, instantaneous measurements, and more.
FPGAs continue to win designs and market share from application-specific standard products (ASSPs) and application-specific integrated circuits (ASICs) because they keep pace with Moore's Law better than other devices, dramatically lower development costs, and enable smaller test systems with lower power consumption. Very capable FPGAs are entering the market and defining the hardware capabilities of many devices, but the IP they contain is vendor defined, and the FPGA's power may not be accessible to you. This is largely because programming these devices requires specialized hardware description language (HDL) expertise, which has a steep learning curve and is generally restricted to digital design experts.
LabVIEW makes the latest FPGA technology accessible to a range of engineers and scientists. Using graphical programming, you can implement logic to define the behavior of an instrument in hardware and reprogram the instrument when requirements change. The graphical dataflow nature of LabVIEW is well suited for implementing and visualizing the type of parallel operations that can be implemented in digital hardware.
Several LabVIEW FPGA reference architectures are available as starting points for your test applications and can be used with devices such as the PXIe-5170R oscilloscope, PXIe-5668R RF VSA, and the PXIe-6591 high-speed serial instrument. For example, you can customize the FPGA according to an instrument data movement model (with customizable start, stop, and reference triggers) or according to a streaming model (ideal for inline signal processing or record and playback applications).
User-programmable FPGAs in your measurement system hardware provide benefits ranging from low-latency DUT control to CPU load reduction. The following sections describe various usage scenarios in more detail.
In many test systems, the device under test (DUT) must be controlled via digital signals. Traditional automated test systems sequence through DUT modes and take the needed measurements in each stage. In some cases, automated test equipment (ATE) systems incorporate intelligence to progress between DUT settings according to the measurement values received.
In either scenario, software-designed instruments that incorporate a user-programmable FPGA save cost and time. Consolidating measurement processing and digital control into a single instrument reduces the need for additional digital I/O in the system and avoids configuring triggering between instruments. In cases where the DUT must be controlled in response to measurement data, software-designed instrumentation closes the loop in hardware and avoids decision making in software, where latency is significantly higher.
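To make the closed-loop idea concrete, here is a minimal sketch in Python of a measurement-driven DUT control loop. Everything in it is illustrative: `measure_power` is a made-up stand-in for a hardware measurement, and the DUT response model is hypothetical. The point is the shape of the loop, which a user-programmable FPGA can execute with microsecond-scale latency instead of round-tripping each decision through host software.

```python
# Illustrative sketch (not NI API): a measurement-driven DUT control loop,
# the kind of decision logic a user-programmable FPGA can close in hardware.

def measure_power(gain_code):
    """Stand-in for a hardware measurement; models a hypothetical DUT whose
    output power rises roughly 1 dB per gain-code step."""
    return -30.0 + 1.0 * gain_code  # dBm (made-up response)

def find_gain_for_target(target_dbm, max_code=63):
    """Step through DUT gain codes until the measured power reaches the
    target -- each iteration is one measure/decide/control cycle."""
    for code in range(max_code + 1):
        power = measure_power(code)
        if power >= target_dbm:
            return code, power
    return max_code, measure_power(max_code)

code, power = find_gain_for_target(-10.0)
print(code, power)  # first gain code whose measured power meets the target
```

In software, each iteration of this loop pays driver-stack and OS latency; compiled into FPGA logic, the same decision happens inline with the measurement.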
Today’s software-based test systems can perform a limited number of measurements in parallel, but software-designed instrumentation is limited only by the available FPGA logic. You can process dozens of measurements or data channels with true hardware parallelism, removing the need to choose between measurements of interest. With software-designed instruments, functionality such as real-time spectral masking is achieved with dramatically higher performance and at a fraction of the cost compared to traditional boxed instruments.
The low latency associated with performing measurements in software-designed instruments means that tens or hundreds of live measurements can be taken and averaged together in the same amount of time a traditional test system requires to take a single measurement. This results in improved test-result quality and increased confidence in your measurements.
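The statistical payoff of averaging many fast measurements can be sketched as follows. This is a generic noise-model illustration, not instrument code: the signal value, noise level, and counts are all assumed for the example. Averaging N independent measurements shrinks the noise roughly as 1/sqrt(N), which is why many cheap hardware measurements beat one slow software-paced one.

```python
# Hypothetical sketch: averaging many fast hardware measurements reduces
# noise roughly as 1/sqrt(N), improving confidence in the result.
import random
import statistics

random.seed(42)
TRUE_VALUE = 1.000   # volts (made-up quantity)
NOISE_STD = 0.050    # per-measurement noise std dev (made-up)

def one_measurement():
    """One noisy reading of the true value."""
    return random.gauss(TRUE_VALUE, NOISE_STD)

def averaged_measurement(n):
    """Average of n independent readings."""
    return sum(one_measurement() for _ in range(n)) / n

# Compare the spread of single readings against 100-point averages.
singles = [one_measurement() for _ in range(200)]
averages = [averaged_measurement(100) for _ in range(200)]
print(statistics.stdev(singles) > 5 * statistics.stdev(averages))  # True
```

With 100-point averages the spread drops by about a factor of 10, so the comparison above holds comfortably.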
Traditional instrument options for low-latency trigger behavior are fixed according to the hardware being used, but with software-designed instrumentation you can incorporate custom triggering functionality into your device to quickly zero in on situations of interest. Flexible hardware-based triggering means that you can implement custom spectral masks or other complex conditions as criteria for either capturing important measurement data or activating additional instrumentation equipment.
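A spectral-mask trigger condition of the kind described above can be expressed very simply; the sketch below is an assumed illustration (the mask values and spectra are invented), showing the per-bin comparison that FPGA logic can evaluate on every acquisition to decide whether to capture a record.

```python
# Illustrative sketch (not vendor code): a spectral-mask trigger condition.
# The instrument captures a record only when some FFT bin exceeds its limit.

def mask_violated(spectrum_db, mask_db):
    """Return True if any bin of the measured spectrum exceeds the mask."""
    return any(s > m for s, m in zip(spectrum_db, mask_db))

mask = [-20.0, -40.0, -40.0, -60.0]    # per-bin limits in dB (made up)
clean = [-25.0, -45.0, -50.0, -70.0]   # compliant spectrum
spur = [-25.0, -45.0, -35.0, -70.0]    # bin 2 violates its -40 dB limit

print(mask_violated(clean, mask))  # False -> no trigger
print(mask_violated(spur, mask))   # True  -> capture data
```

In hardware, every bin comparison runs in parallel, so the trigger decision costs essentially no extra time regardless of how complex the mask is.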
Processing large amounts of data can tax even the most capable commercial CPUs, resulting in systems with multiple processors or extended test times. With software-designed instrumentation you can preprocess data in the hardware, potentially reducing the CPU load significantly. Computations such as fast Fourier transforms (FFTs), filtering, digital downconversion, and channelization are implemented in hardware, reducing the amount of data passed to and processed by the CPU.
Engineers and scientists traditionally use instrumentation just for test and measurement applications, but the connectivity between I/O and software in modular instruments enables you to prototype electronic systems using instrumentation. For example, engineers can prototype advanced radar systems using digitizers and RF signal analyzers; the connectivity to a user-programmable FPGA allows advanced algorithms to be deployed in the prototype more quickly and proofs of concept to be validated faster.
Modern data communication protocols have transitioned from parallel interfaces to high-speed serial interfaces such as PCI Express, the HDMI and DisplayPort video standards, IEEE 1394b, and USB 3.0. For design and test engineers, validating these interfaces presents new challenges and requires new test hardware. Traditionally, engineers employ very expensive oscilloscopes or bit error rate testers to characterize physical interfaces, and protocol-specific analyzers and generators to validate correct protocol stack implementation and efficient data transmission and reception. User-programmable FPGAs offer an intriguing solution to these challenges. Modern, high-performance FPGAs generally include several multigigabit transceivers that support a variety of high-speed serial interfaces. Combined with the appropriate protocol-specific IP, graphical LabVIEW FPGA programming, and the advantages of the PXI ecosystem, a new type of instrument emerges: a high-speed serial, software-designed instrument.
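At the heart of serial-link validation is the bit error rate (BER) comparison that dedicated testers perform, and that an FPGA's multigigabit transceivers can perform at line rate against a known pattern. The sketch below is a toy illustration with an invented 32-bit pattern; real testers use pseudorandom binary sequences (PRBS) and count errors over trillions of bits.

```python
# Illustrative sketch: the received-vs-expected bit comparison behind
# bit-error-rate (BER) testing of a high-speed serial link.

def bit_errors(sent, received):
    """Count differing bits between two equal-length bit sequences."""
    return sum(s != r for s, r in zip(sent, received))

def ber(sent, received):
    """Fraction of transmitted bits received in error."""
    return bit_errors(sent, received) / len(sent)

pattern = [1, 0, 1, 1, 0, 0, 1, 0] * 4   # 32-bit toy pattern
corrupt = list(pattern)
corrupt[5] ^= 1                           # inject a single bit error

print(ber(pattern, corrupt))  # 1/32 = 0.03125
```

Compiled into transceiver-side FPGA logic, this comparison keeps pace with the multi-gigabit data stream rather than requiring every bit to reach the host.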
Although this paper focuses primarily on test, engineers are increasingly reusing IP between the design and test stages to considerably reduce time to market and overall test expense. With LabVIEW FPGA, digital signal processing algorithms and digital protocols can be defined during product research and development and then reused as part of device or component verification—eliminating the need to generate test code from scratch.
Vendor-defined, fixed-capability, off-the-shelf instruments will remain available for years to come, but increasingly complex devices and time-to-market pressure mean that software-designed instruments will play an increasingly important role in test instrumentation, starting right now.
Software-designed instrumentation provides the highest level of flexibility, performance, and future-proofing currently possible with off-the-shelf hardware. As your system requirements change, software-designed instruments will preserve your software investment across different pieces of modular I/O and also ensure your existing I/O can be modified according to the application at hand.
With a software-designed instrument, you are limited only by your imagination. Watch this video for a glimpse of the possibilities.