Synchronization in telecommunications

From Wikipedia, the free encyclopedia

In telecommunications, a synchronous network is a network in which clocks are controlled to run, ideally, at identical rates, or at the same mean rate with a fixed relative phase displacement, within a specified limited range.

Many services running on modern digital telecommunications networks require accurate synchronization for correct operation. For example, if switches do not operate with the same clock rates, then slips will occur and degrade performance. Telecommunication networks rely on the use of highly accurate primary reference clocks which are distributed network-wide using synchronization links and synchronization supply units. Ideally, the clocks are synchronous, but they may be mesochronous in practice. In common usage, mesochronous networks are often described as synchronous.
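The relationship between clock accuracy and slip rate can be sketched numerically. The function below is illustrative (the name `slips_per_day` and the default 125 µs buffer, one PCM frame, are assumptions for this sketch): a controlled slip occurs each time the accumulated phase error between two clocks fills or empties the slip buffer.

```python
def slips_per_day(fractional_offset, buffer_s=125e-6):
    """Estimate the controlled-slip rate for a given fractional frequency offset.

    A slip occurs each time the accumulated phase error between the two
    clocks fills (or empties) the slip buffer, taken here as one
    125 microsecond PCM frame.
    """
    seconds_per_day = 86_400
    # Phase error accumulated over one day, in seconds
    drift_per_day = fractional_offset * seconds_per_day
    return drift_per_day / buffer_s

# A clock at the G.811 limit of 1 part in 10^11 against a perfect reference:
print(slips_per_day(1e-11))   # well under one slip per day
# A free-running oscillator off by 1e-6 (1 ppm):
print(slips_per_day(1e-6))    # hundreds of slips per day
```

This illustrates why switch clocks must be held to very small relative frequency offsets: the slip rate grows linearly with the offset.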

Components

Primary reference clock (PRC)

Modern telecommunications networks use highly accurate primary master clocks that must meet the international standards requirement for long-term frequency accuracy of better than 1 part in 10¹¹. To achieve this performance, atomic clocks or GPS-disciplined oscillators are normally used.
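A short calculation shows what this accuracy requirement means in practice for accumulated time error (the variable names are illustrative):

```python
# Worst-case time error accumulated in one day by a clock at the
# long-term accuracy limit of 1 part in 10^11, relative to an
# ideal reference.
accuracy = 1e-11
seconds_per_day = 86_400
error_us_per_day = accuracy * seconds_per_day * 1e6  # in microseconds
print(error_us_per_day)  # roughly 0.864 microseconds of drift per day
```

At that rate, a clock would take years to accumulate even a second's worth of 125 µs slips, which is why PRC-grade sources anchor the whole synchronization hierarchy.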

Synchronization supply unit

Synchronization supply units (SSUs) are used to ensure reliable synchronisation distribution. They have a number of key functions:

  1. They filter the synchronisation signal they receive to remove higher-frequency phase noise.
  2. They provide distribution, offering a scalable number of outputs to synchronise other local equipment.
  3. They can continue producing a high-quality output even when their input reference is lost; this is referred to as holdover mode.
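The three functions above can be sketched as a toy model. This is not a real SSU algorithm (the class name, the first-order low-pass filter, and the port fan-out are all assumptions chosen to keep the sketch minimal):

```python
class SSUSketch:
    """Toy model of the three SSU functions: filtering, distribution, holdover."""

    def __init__(self, bandwidth=0.01):
        self.alpha = bandwidth     # first-order low-pass filter coefficient
        self.freq_estimate = 0.0   # smoothed fractional frequency of the reference
        self.holdover = False

    def on_reference_sample(self, measured_freq):
        # 1. Filter: a first-order low-pass removes high-frequency phase noise
        #    from the received synchronisation signal.
        self.freq_estimate += self.alpha * (measured_freq - self.freq_estimate)
        self.holdover = False

    def on_reference_lost(self):
        # 3. Holdover: keep running on the last filtered frequency estimate.
        self.holdover = True

    def output(self, n_ports=8):
        # 2. Distribution: fan the same timing signal out to local equipment.
        return [self.freq_estimate] * n_ports
```

In holdover the quality of the output degrades over time as the local oscillator drifts away from the last filtered estimate; real SSUs specify a holdover stability budget for exactly this reason.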

Quality metrics

In telecoms networks, two key parameters are used to measure synchronisation performance. These parameters are defined by the International Telecommunication Union in its Recommendation G.811, by the European Telecommunications Standards Institute in its standard EN 300 462-1-1, by the ANSI Synchronization Interface Standard T1.101 (which defines profiles for clock accuracy at each stratum level), and by the Telcordia/Bellcore standards GR-253[1] and GR-1244.[2]

  • Maximum time interval error (MTIE) is a measure of the worst case phase variation of a signal with respect to a perfect signal over a given period of time.
  • Time deviation (TDEV) is a statistical analysis of the phase stability of a signal over a given period of time.
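Of the two, MTIE is straightforward to compute from a series of time interval error (TIE) samples measured against a reference: it is the largest peak-to-peak TIE seen in any observation window of the given length. The function below is an illustrative sketch (the name `mtie` and the sample data are assumptions); TDEV requires a statistical average over second differences of the phase and is omitted here.

```python
def mtie(tie_samples, window):
    """Maximum time interval error over all sliding windows of `window` samples.

    `tie_samples` is a sequence of time interval error measurements
    (e.g. in nanoseconds) taken at a fixed rate against a reference clock.
    """
    if window > len(tie_samples):
        raise ValueError("window longer than measurement")
    worst = 0
    for i in range(len(tie_samples) - window + 1):
        w = tie_samples[i:i + window]
        # Peak-to-peak phase variation within this observation window
        worst = max(worst, max(w) - min(w))
    return worst

# A slow few-ns wander plus a one-off 20 ns excursion:
tie = [0, 1, 2, 3, 2, 1, 0, 20, 0, 1]
print(mtie(tie, 4))  # → 20: the excursion dominates the worst window
```

Because it is a worst-case measure, MTIE is sensitive to single phase transients that a purely statistical measure such as TDEV would average away.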

References