This application note presents a new method for accurately measuring jitter with a real-time oscilloscope when the jitter added to a signal by the measurement environment approaches or exceeds the signal's intrinsic jitter. The method builds on previous work that combined measurement and modeling data to eliminate false spurs in peak-to-peak jitter data. Here we focus on eliminating the contribution of random amplitude noise, introduced by the test environment, to RMS jitter data.
A test environment can add phase and amplitude noise to a signal under test (SUT). Phase noise modulates a signal's edges directly, whereas amplitude noise converts to phase error during the oscilloscope sampling process: a voltage error on a finite-slope edge shifts the apparent threshold-crossing time. Both mechanisms increase the measured SUT jitter above its true value. The dominant source of amplitude noise in a test environment is often vertical (quantization) noise in the oscilloscope's sampling system. This noise can be reduced through careful oscilloscope setup, but it is always present to some extent. Imperfections in the oscilloscope's interleaving architecture also add amplitude noise, which distorts the measured waveform.
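The amplitude-to-phase conversion described above can be illustrated with a small simulation. The sketch below is not the application note's method; it is a simplified model in which each acquisition of an idealized linear-ramp edge carries a random amplitude offset, and the resulting spread of threshold-crossing times is compared with the rule-of-thumb prediction that timing error equals voltage error divided by slew rate. The rise time, swing, and noise level are illustrative assumptions, not values from the note.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative edge model (assumed values, not from the note):
# 1 V swing, linear ramp with an 8 GV/s slew rate at the 50% threshold.
slew_rate = 8e9          # V/s
v_noise_rms = 2e-3       # 2 mV RMS amplitude noise per acquisition

# Finely sampled rising edge centered on its 50% crossing at t = 0.
t = np.linspace(-200e-12, 200e-12, 4001)
edge = np.clip(0.5 + slew_rate * t, 0.0, 1.0)

crossings = []
for _ in range(2000):
    # Model amplitude noise as a per-acquisition voltage offset.
    noisy = edge + rng.normal(0.0, v_noise_rms)
    # Locate the 50% crossing by linear interpolation between the
    # last sample below threshold and the first sample at/above it.
    i = np.argmax(noisy >= 0.5)
    frac = (0.5 - noisy[i - 1]) / (noisy[i] - noisy[i - 1])
    crossings.append(t[i - 1] + frac * (t[i] - t[i - 1]))

measured_jitter = np.std(crossings)
predicted_jitter = v_noise_rms / slew_rate   # dt = dv / (dv/dt)

print(f"predicted: {predicted_jitter * 1e12:.3f} ps RMS")
print(f"measured:  {measured_jitter * 1e12:.3f} ps RMS")
```

Because the edge is linear near the threshold, the measured crossing-time spread closely matches the voltage-noise-over-slew-rate prediction; on a real signal with finite curvature and band-limited noise the agreement is only approximate.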