NI 5160/5162 Calibration
- Updated 2023-09-20
Every measurement instrument performs within its specifications over a finite temperature range and time period. If the temperature moves outside the specified range, or the specified time period elapses, and your application requires tight specifications, calibration is required.
For example, if the accuracy of a digitizer is specified as ±(1% of input + 10 mV), and you apply 5 V to the input, the error is:
1% of 5 V + 10 mV = 60 mV, valid over the 18 °C to 28 °C temperature range
This example demonstrates the traditional method of specifying accuracy. The problem with the traditional method is that temperature is not easily controlled in a system environment. When a system is composed of multiple integrated instruments, it is subject to temperature rise from compromised air circulation, self-heating from surrounding equipment, an uncontrolled manufacturing-floor environment, and dirty fan filters.
If the ambient temperature is outside of the 18 °C to 28 °C range, you may need to know exactly what the measurement accuracy is so you can compensate for the temperature variation. With the traditional method, the only way to get the specified accuracy outside of the 18 °C to 28 °C range is to externally calibrate the system at the desired temperature. However, external calibration is time-consuming and expensive, so it is done infrequently and the specified accuracy is rarely obtained. You can learn more about external calibration at ni.com/calibration. In the example, if the ambient temperature of the digitizer is 48 °C, and the temperature coefficient (Tempco, or TC) error is specified as
TC = (0.1% of input + 1 mV)/°C (a typical value is 10% of accuracy/°C)
then the additional error for the 20 °C excursion beyond the specified range (48 °C − 28 °C) is
20 °C × TC = ±(2% of input + 20 mV), or 120 mV for the 5 V input
The total error is three times the specified error (180 mV in this example, versus 60 mV if the temperature effect is ignored) because of the 48 °C ambient temperature.
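The arithmetic above can be sketched as a small helper. The function name, parameters, and defaults are illustrative only (they encode the example spec from this article), not part of any NI API:

```python
def total_error_volts(v_in, gain_pct=1.0, offset_mv=10.0,
                      tc_gain_pct=0.1, tc_offset_mv=1.0,
                      ambient_c=48.0, range_lo_c=18.0, range_hi_c=28.0):
    """Worst-case absolute error for the example digitizer spec.

    Base spec:  +/-(gain_pct% of input + offset_mv), valid 18-28 deg C.
    Tempco:     (tc_gain_pct% of input + tc_offset_mv) per deg C
                outside the specified temperature range.
    """
    base = v_in * gain_pct / 100.0 + offset_mv / 1000.0
    # Degrees outside the specified temperature range (0 if inside it)
    excess_c = max(0.0, ambient_c - range_hi_c, range_lo_c - ambient_c)
    tempco = excess_c * (v_in * tc_gain_pct / 100.0 + tc_offset_mv / 1000.0)
    return base + tempco

# Worked example from the text: 5 V input, 48 deg C ambient
print(f"{total_error_volts(5.0) * 1000:.0f} mV")                   # 180 mV
print(f"{total_error_volts(5.0, ambient_c=25.0) * 1000:.0f} mV")   # 60 mV
```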
Self-Calibration
To eliminate errors caused by changing temperatures, NI-SCOPE provides a highly repeatable self-calibration function.
For the NI 5160/5162, self-calibration yields the following benefits:
- Corrects for DC gain and offset errors within the digitizer by comparison to a precision, high-stability internal voltage reference. This is done for all ranges, both input impedance paths (50 Ω and 1 MΩ), and all filter paths (enabled/disabled).
- Calibrates trigger level offset and gain.
- Calibrates trigger timing, as well as the time-to-digital conversion (TDC) circuitry to ensure accurate trigger timing and time-stamping.
- Calibrates gain, offset, and phase for time-interleaved sampling (TIS) modes, reducing ADC interleaving errors.
- Calibrates the phase DAC used to adjust the phase of the sample clock.
- Takes approximately 2 minutes to complete.
When to Self-Calibrate
For optimum performance, use self-calibration in the following circumstances:
- When the digitizer is placed in a new system and has warmed up for at least 15 minutes
- Any time the temperature changes more than 3 °C from the previous self-calibration (refer to the device specifications for specific temperature ranges, as some specifications may have different temperature limits)
- 90 days after the previous self-calibration
The result is a product that yields full performance over its operating temperature range and two-year calibration cycle for DC accuracy, AC response, and trigger level/timing. When the two-year calibration interval expires, an external calibration is required to ensure performance that is within specification over the next calibration interval.
The NI 5160/5162 has a temperature sensor that monitors temperature variations and can be read through the NI-SCOPE "Device Temperature" attribute. The previous self-calibration time and date can also be read. Unless temperature variations are a serious problem, self-calibration is not recommended more than once per day.
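The self-calibration criteria above (a temperature drift of more than 3 °C, or more than 90 days since the previous self-calibration) can be encoded in a small decision helper. This is a hypothetical sketch, not an NI function; the temperature and last-calibration values would come from the driver attributes mentioned above:

```python
from datetime import datetime, timedelta

def self_cal_recommended(current_temp_c, last_cal_temp_c, last_cal_time,
                         now=None, max_delta_c=3.0, max_age_days=90):
    """Return True if the article's guidance suggests self-calibrating.

    Hypothetical helper: thresholds mirror the text (3 deg C drift or
    90 days since the previous self-calibration).
    """
    now = now or datetime.now()
    drifted = abs(current_temp_c - last_cal_temp_c) > max_delta_c
    stale = (now - last_cal_time) > timedelta(days=max_age_days)
    return drifted or stale

# Example: 5 deg C warmer than at the last self-cal, only 10 days ago
last = datetime(2023, 9, 10)
print(self_cal_recommended(45.0, 40.0, last, now=datetime(2023, 9, 20)))  # True
```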
Input Connections During Self-Calibration
The NI 5160/5162 internal circuitry is automatically isolated from the input during self-calibration. However, if high-voltage, high-frequency signals are present during self-calibration, the calibration results may be adversely affected or the calibration may fail with an error.
Whenever possible, disconnect the inputs before self-calibrating.
Programming Flow
The following diagram shows the typical programming flow for self-calibration.
NI-SCOPE provides the Calibrate example, which you can find by using the shortcut at Start»All Programs»National Instruments»NI-SCOPE»Examples.
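The flow can also be sketched with the niscope Python package (nimi-python). The resource name below is a placeholder, and the exact options available for self-calibration should be checked against the NI-SCOPE documentation for your driver version:

```python
def run_self_calibration(resource_name="PXI1Slot2"):
    """Open a session, run self-calibration, and close the session.

    Sketch of the programming flow using the niscope Python package
    (nimi-python). The resource name is a placeholder; use the device
    name shown in NI MAX.
    """
    import niscope  # imported here so the sketch parses without the driver

    with niscope.Session(resource_name) as session:
        # Inputs should be disconnected or quiet before this point.
        session.self_cal()  # takes roughly 2 minutes on the NI 5160/5162

# run_self_calibration("PXI1Slot2")  # uncomment with hardware attached
```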
Summary of Calibration Options
A summary of the calibration options available and when to use them is shown in the following table.
Calibration | Impact | When | Notes |
---|---|---|---|
External calibration | Calibrates time drift of the onboard reference. NI 5160/5162 (2 CH): calibrates the external analog trigger channel. | Once every two years | Calibrates and verifies to full specifications |
Self-calibration | Offset and gain; trigger level; trigger timing; interleaved ADC gain, offset, and phase; sample clock phase (phase DAC) | Every 90 days, or when temperature changes by more than 3 °C | Ensures range-to-range matching and trigger accuracy |
No calibration | None, within the two-year calibration cycle and if temperature stays within ±3 °C | When high absolute accuracy is not required outside a 3 °C window | If self-calibration is not used, derate the absolute accuracy using the specified temperature coefficient (Tempco) |