System Self Calibration
- Updated 2024-10-21
- 4 minute(s) read
The RTG performs self-calibration so that it can provide accurate delays and attenuations as well as optimized spectral performance. This calibration is valid only for a given VST center frequency, input reference level, and data path (the routing between the VST and coprocessor). If any of these parameters change, the current calibration becomes inaccurate and must be re-run. To achieve the best accuracy, run system self-calibration, or load previously saved calibration data, every time an RTG session is restarted. Loading stored calibration data requires no cabling changes and saves time.
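The reuse rule above can be sketched as a simple validity check. This is a hypothetical illustration, not the RTG API: the `CalibrationKey` type and `calibration_is_valid` function are assumptions introduced here to show that saved calibration data applies only when every parameter it was captured under matches the current session exactly.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CalibrationKey:
    """Hypothetical record of the parameters a calibration is valid for."""
    center_frequency_hz: float   # VST center frequency
    reference_level_dbm: float   # input reference level
    data_path: str               # routing between the VST and coprocessor


def calibration_is_valid(saved: CalibrationKey, current: CalibrationKey) -> bool:
    # Saved data is reusable only if every parameter matches; any change
    # invalidates the calibration and self-calibration must be re-run.
    return saved == current


saved = CalibrationKey(2.4e9, -10.0, "VST->coprocessor")
same = CalibrationKey(2.4e9, -10.0, "VST->coprocessor")
retuned = CalibrationKey(5.8e9, -10.0, "VST->coprocessor")
```

With this check, `calibration_is_valid(saved, same)` is true while `calibration_is_valid(saved, retuned)` is false, since changing the center frequency invalidates the stored data.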
The different parts of system self-calibration run back-to-back. Each part is described below.
- Static scenarios—Update the offset frequency value either in the UI or through the API. This frequency will be used to apply corrections to all configurations.
- Dynamic scenarios—Configure the RTG's enable frequency correction option. When set to TRUE, the RTG measures the frequency of the radar pulse and applies the appropriate corrections. When set to FALSE, the RTG uses the value entered in the offset frequency parameter to apply the corrections. The only caveat is that if the desired target delay is less than the time required to measure the frequency and calculate the compensation, the user-entered offset frequency will be used in place of the measured frequency. Refer to Attenuation for more information about on-the-fly (OTF) correction.
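The decision logic described for dynamic scenarios can be sketched as follows. This is a minimal illustration under stated assumptions, not the RTG API: the function name, parameter names, and the idea of modeling the measurement time as a single duration are all introduced here for clarity.

```python
def correction_frequency(enable_frequency_correction: bool,
                         target_delay_s: float,
                         measurement_time_s: float,
                         measured_frequency_hz: float,
                         offset_frequency_hz: float) -> float:
    """Pick which frequency is used to apply corrections (hypothetical sketch).

    With frequency correction enabled, the measured pulse frequency is used,
    unless the desired target delay is shorter than the time needed to measure
    the frequency and calculate the compensation; in that case the user-entered
    offset frequency is used instead. With correction disabled, the offset
    frequency is always used.
    """
    if enable_frequency_correction and target_delay_s >= measurement_time_s:
        return measured_frequency_hz
    return offset_frequency_hz
```

For example, with correction enabled, a 10 µs target delay and a 2 µs measurement time select the measured frequency, while a 1 µs target delay falls back to the user-entered offset frequency.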
Related Information
- Attenuation
The RTG has several sources of attenuation.