Synchronization in telecommunications

From Wikipedia, the free encyclopedia

Many services running on modern digital telecommunications networks require accurate synchronization for correct operation. For example, if telephone exchanges are not synchronized, then bit slips will occur and degrade performance. Telecommunication networks rely on the use of highly accurate primary reference clocks which are distributed network-wide using synchronization links and synchronization supply units.
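The effect of imperfect synchronization on bit slips can be estimated with a little arithmetic. The sketch below assumes, hypothetically, a 125-microsecond slip buffer (the frame period of a standard 64 kbit/s PCM channel) and two exchange clocks each held to 1 part in 10¹¹; the constants are illustrative rather than drawn from a specific deployment.

```python
# Rough estimate of the interval between controlled frame slips when two
# exchange clocks drift apart at the limits of their accuracy.
FRAME_PERIOD_S = 125e-6    # assumed slip-buffer depth: one 125 us PCM frame
CLOCK_ACCURACY = 1e-11     # each clock within 1 part in 10^11

# Worst case: the two clocks sit at opposite extremes of their tolerance.
worst_case_offset = 2 * CLOCK_ACCURACY
seconds_per_slip = FRAME_PERIOD_S / worst_case_offset
days_per_slip = seconds_per_slip / 86_400
print(f"{days_per_slip:.1f} days between slips")  # ≈ 72.3 days
```

With both clocks meeting the accuracy target, slips are rare (on the order of months apart); a free-running or poorly synchronized exchange clock would slip far more often.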

Ideally, the clocks in a telecommunications network are synchronous: controlled to run at identical rates, or at the same mean rate with a fixed relative phase displacement within a specified limit. In practice, however, they may be mesochronous, running at the same mean rate but with an unspecified phase relationship. In common usage, mesochronous networks are often described simply as synchronous.


Primary reference clock (PRC)

Modern telecommunications networks use highly accurate primary master clocks that must meet the international standard's requirement for long-term frequency accuracy of better than 1 part in 10¹¹.[1] To achieve this performance, atomic clocks or GPS-disciplined oscillators are normally used.
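To give a sense of scale, the accumulated time error of a clock running at this accuracy limit can be computed directly; the figures below follow from the 1-part-in-10¹¹ bound alone.

```python
# Time error accumulated by a clock at the long-term accuracy limit of
# 1 part in 10^11, i.e. why a PRC can be trusted for long periods.
ACCURACY = 1e-11
SECONDS_PER_DAY = 86_400

error_per_day_us = ACCURACY * SECONDS_PER_DAY * 1e6   # in microseconds
print(f"{error_per_day_us:.3f} microseconds per day")  # 0.864
```

That is, even after a full day a conforming PRC drifts by less than a microsecond relative to an ideal clock.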

Synchronization supply unit

Synchronization supply units (SSUs) are used to ensure reliable distribution of synchronization. They have a number of key functions:

  1. They filter the synchronization signal they receive to remove higher-frequency phase noise.
  2. They provide distribution, offering a scalable number of outputs to synchronize other local equipment.
  3. They continue to produce a high-quality output even when their input reference is lost; this is referred to as holdover mode.
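The filtering and holdover behaviour described above can be sketched as a toy model. This is a hypothetical illustration only: real SSUs use tightly specified phase-locked-loop bandwidths and high-stability oscillators (e.g. oven-controlled quartz or rubidium) for holdover, none of which appears here.

```python
class ToySSU:
    """Illustrative SSU model: low-pass filter a reference, hold over on loss.

    Hypothetical sketch, not a real SSU design. `alpha` is the gain of a
    simple first-order filter; a small alpha means a narrow bandwidth,
    which suppresses high-frequency phase noise on the input.
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.freq = 1.0  # normalized output frequency

    def step(self, reference_freq=None):
        if reference_freq is not None:
            # Filter the received reference to remove fast fluctuations.
            self.freq += self.alpha * (reference_freq - self.freq)
        # else: holdover mode -- keep producing the last filtered value.
        return self.freq


ssu = ToySSU(alpha=0.5)
print(ssu.step(2.0))   # 1.5  (output pulled toward the reference)
print(ssu.step(2.0))   # 1.75
print(ssu.step(None))  # 1.75 (reference lost: output held over)
```

In a real unit the holdover output would slowly drift with the local oscillator, which is why holdover performance is a key SSU specification.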

Quality metrics

In telecoms networks, two key parameters are used to measure synchronisation performance. These parameters are defined by the International Telecommunication Union in its recommendation G.811 and by the European Telecommunications Standards Institute in its standard EN 300 462-1-1. The ANSI Synchronization Interface Standard T1.101 defines profiles for clock accuracy at each stratum level, and the Telcordia (formerly Bellcore) standards GR-253[2] and GR-1244[3] also address synchronisation performance.

  • Maximum time interval error (MTIE) is a measure of the worst case phase variation of a signal with respect to a perfect signal over a given period of time.
  • Time deviation (TDEV) is a statistical analysis of the phase stability of a signal over a given period of time.
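MTIE lends itself to a short worked example. The sketch below computes MTIE over a series of phase-error samples as the largest peak-to-peak variation seen in any observation window of a given length; this is a simplified illustration of the definition, not the measurement procedure prescribed by the standards.

```python
def mtie(phase_samples, window):
    """Maximum time interval error: the worst peak-to-peak phase variation
    over any sliding window of `window` consecutive samples.

    `phase_samples` are phase (time) errors relative to a perfect reference,
    in arbitrary but consistent units (e.g. nanoseconds).
    """
    return max(
        max(phase_samples[i:i + window]) - min(phase_samples[i:i + window])
        for i in range(len(phase_samples) - window + 1)
    )


# Phase error of a signal sampled at regular intervals (illustrative values).
samples = [0, 1, 3, 2, 0]
print(mtie(samples, window=3))  # 3
```

In practice MTIE is evaluated for a whole range of window lengths (observation intervals) and plotted against the masks given in the standards.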

References


  • This article incorporates public domain material from the General Services Administration document: "Federal Standard 1037C" (in support of MIL-STD-188).
  • Bregni, Stefano (2002). Synchronization of Digital Telecommunications Networks. Wiley. ISBN 0-471-61550-1.