AN10070 Computing TIE Crest Factors for Telecom Applications

Time-interval error (TIE) is defined as the short-term variation of the significant instants of a digital signal from their ideal positions in time. The methodologies for computing TIE crest factors discussed in this application note fall into two groups: (1) methodologies for telecommunications (i.e., “telecom”) applications, and (2) methodologies for non-telecom applications. Telecom applications generally revolve around a narrow group of industry standards, including SONET, SDH, and OTN. These standards quantify total jitter as RMS and peak-to-peak values based on analog measurements taken within a 60-second time interval. Non-telecom applications are assumed here to include everything else, and are associated with a wide variety of industry standards (e.g., Fibre Channel, PCI Express, Ethernet). These standards decompose total jitter into random and deterministic components in order to estimate total jitter at a low target bit-error ratio (BER). This document addresses telecom applications. Refer to application note AN10071, Computing TIE Crest Factors for Non-telecom Applications, for a discussion of non-telecom applications.
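
To make the two methodologies concrete, the sketch below is a minimal Python illustration and is not taken from the application note itself. The least-squares ideal-clock fit, the peak-to-peak/RMS crest-factor convention, and the dual-Dirac form TJ(BER) = DJ + 2*Q(BER)*RJrms are common industry conventions assumed here for illustration; standards may specify different reference clocks or definitions.

```python
import numpy as np
from scipy.stats import norm

def tie_record(edge_times):
    """TIE: deviation of each significant instant from its ideal position.

    Here the ideal positions come from a least-squares linear fit to the
    measured edges (a constant-frequency reference); standards may instead
    specify a golden-PLL or fixed nominal reference.
    """
    n = np.arange(len(edge_times))
    slope, intercept = np.polyfit(n, edge_times, 1)
    return edge_times - (intercept + slope * n)

def crest_factor(tie):
    """Peak-to-peak TIE divided by RMS TIE (one common convention)."""
    return np.ptp(tie) / np.std(tie)

def total_jitter(dj_pp, rj_rms, ber=1e-12):
    """Dual-Dirac-style estimate used by many non-telecom standards:
    TJ(BER) = DJ(pp) + 2 * Q(BER) * RJ(rms), with Q(1e-12) ~ 7.03."""
    return dj_pp + 2.0 * norm.isf(ber) * rj_rms

# Telecom-style measurement: RMS and peak-to-peak TIE over one record.
# A short simulated record with 1 ps RMS Gaussian jitter stands in for a
# real 60-second capture.
rng = np.random.default_rng(0)
ui = 100e-9                                    # 10 MHz nominal period
edges = np.arange(100_000) * ui + rng.normal(0.0, 1e-12, size=100_000)
tie = tie_record(edges)
print(f"RMS = {np.std(tie):.2e} s, pk-pk = {np.ptp(tie):.2e} s, "
      f"crest factor = {crest_factor(tie):.1f}")

# Non-telecom-style extrapolation: 20 ps DJ, 1.5 ps RMS RJ at BER = 1e-12.
print(f"TJ(1e-12) = {total_jitter(20e-12, 1.5e-12):.2e} s")
```

With this simulated Gaussian record the printed crest factor comes out near 9, showing how far observed peaks stray from the RMS value in a finite record, while the dual-Dirac line extrapolates to TJ(1e-12) of roughly 41 ps from 20 ps DJ and 1.5 ps RMS RJ.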
