Time-interval error (TIE) is defined as the short-term variation of the significant instants of a digital signal from their ideal positions in time. The methodologies for computing TIE crest factors discussed in this application note fall into two groups: (1) methodologies for telecommunications (i.e. "telecom") applications, and (2) methodologies for non-telecom applications. Telecom applications generally revolve around a narrow group of industry standards, including SONET, SDH, and OTN. These standards quantify total jitter as RMS and peak-peak values based on analog measurements taken within a 60-second time interval. Non-telecom applications are assumed here to include everything else, and are associated with a wide variety of industry standards (e.g. Fibre Channel, PCI Express, Ethernet). These standards decompose total jitter into random and deterministic components to estimate total jitter at a low target bit-error ratio.

This document addresses telecom applications. Refer to application note AN10070, Computing TIE Crest Factors for Non-telecom Applications, for a discussion of non-telecom applications.
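To make the RMS and peak-peak quantities concrete, the sketch below computes both from a record of TIE samples and forms their ratio. Note that the function name, the use of the ratio peak-peak/RMS as the crest factor, and the sinusoidal example record are illustrative assumptions, not a method prescribed by the standards cited above; the standards define the exact measurement conditions (e.g. the 60-second interval and measurement bandwidth).

```python
import math

def tie_rms_pkpk_crest(tie_samples):
    """Return (RMS, peak-peak, crest factor) for a record of TIE
    samples in seconds.

    RMS is computed about the record mean (i.e. the standard
    deviation of the TIE record), and the crest factor is taken
    here as peak-peak divided by RMS -- an illustrative convention,
    not a definition from SONET/SDH/OTN standards.
    """
    n = len(tie_samples)
    mean = sum(tie_samples) / n
    rms = math.sqrt(sum((x - mean) ** 2 for x in tie_samples) / n)
    pk_pk = max(tie_samples) - min(tie_samples)
    return rms, pk_pk, pk_pk / rms

# Example: a purely sinusoidal TIE record of 1 ps amplitude.
# For a sinusoid, peak-peak/RMS is 2*sqrt(2) (about 2.83).
samples = [1e-12 * math.sin(2 * math.pi * k / 100) for k in range(100)]
rms, pk_pk, crest = tie_rms_pkpk_crest(samples)
```

Real TIE records are not sinusoidal, so the ratio of peak-peak to RMS varies with the jitter's statistical character; that variability is what motivates studying crest factors in the first place.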