Reference Level and Pulse Detection Calibration
While the RTG is capable of providing simulated targets with varying power levels, it is designed assuming the input radar pulse has a consistent power level. With this known, fixed power level, an input reference level can be chosen that optimizes dynamic range. It is recommended that the radar signal level be adjusted or attenuated to fall within a ±10 dBm range. This power level not only protects the instrument from damage but also yields the best signal quality in the acquisition processing. The ideal reference level is approximately 3 dB higher than the expected peak radar power. Note that variations in input radar power will result in variations in target output power, because the RTG is designed to apply a calibrated amount of attenuation rather than output a calibrated amount of power.
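The reference-level rule of thumb above can be captured in a few lines of code. The following is a minimal Python sketch, not an RTG API: the helper name and the interpretation of the ±10 dBm figure as the allowed peak-power window are assumptions made for illustration.

```python
def recommended_reference_level(peak_radar_power_dbm: float) -> float:
    """Return a reference level ~3 dB above the expected peak radar power.

    Assumes (for illustration) that the +/-10 dBm recommendation describes the
    acceptable peak input power; signals outside that window should be
    adjusted or attenuated before reaching the RTG input.
    """
    if not -10.0 <= peak_radar_power_dbm <= 10.0:
        raise ValueError(
            f"Peak power {peak_radar_power_dbm:+.1f} dBm is outside the "
            "recommended +/-10 dBm window; adjust or attenuate the radar "
            "signal before choosing a reference level."
        )
    return peak_radar_power_dbm + 3.0  # ~3 dB of headroom above the peak


# Example: a radar with a +4 dBm peak would use roughly a +7 dBm reference level.
print(recommended_reference_level(4.0))
```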
The choice of reference level also affects pulse detection. The pulse detection algorithm monitors changes in power level to detect the presence of a pulse. Its rising and falling thresholds default to values calibrated for a near-optimal reference level, which yields good dynamic range in the acquired signal. If the acquired signal is too low relative to the reference level, the radar power may not be sufficient for the pulse detection algorithm to trigger reliably. In that case, running the pulse detection calibration readjusts those thresholds so that they work with the provided signal. The pulse detection calibration must be run manually, because calibration should only occur while a known, representative signal is being acquired by the RTG. The calibration is quick, can be performed at any time, and may be repeated as necessary.
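The RTG's internal detection and calibration routines are not documented in this section; the sketch below only illustrates the general idea of rising/falling power thresholds and of deriving new thresholds from a representative acquisition. All function names, the percentile-based noise-floor and peak estimates, and the 3 dB hysteresis are assumptions chosen for illustration.

```python
import numpy as np


def detect_pulses(power_dbm: np.ndarray, rise_thresh: float, fall_thresh: float):
    """Return (start, stop) sample indices using rising/falling power thresholds."""
    pulses, start = [], None
    for i, p in enumerate(power_dbm):
        if start is None and p >= rise_thresh:
            start = i                      # power rose above the rising threshold
        elif start is not None and p <= fall_thresh:
            pulses.append((start, i))      # power fell below the falling threshold
            start = None
    return pulses


def calibrate_thresholds(power_dbm: np.ndarray):
    """Derive thresholds from a known, representative acquisition.

    Places the rising threshold halfway between an estimated noise floor and
    the observed pulse peak, with the falling threshold a few dB lower to
    provide hysteresis. The percentile choices here are illustrative only.
    """
    noise_floor = np.percentile(power_dbm, 10)
    pulse_peak = np.percentile(power_dbm, 99)
    rise = noise_floor + 0.5 * (pulse_peak - noise_floor)
    fall = rise - 3.0                      # hysteresis to avoid re-triggering
    return rise, fall
```

Running the calibration step on a weak but representative acquisition lowers both thresholds, which is why detection recovers even when the signal sits well below the reference level.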