Why timing and synchronization have evolved in telecom networks

Telecommunication networks are fundamentally about sending and receiving signals accurately. For a network to juggle the complex transmission and reception of data from many devices in many locations, accurate timing and synchronization are crucial. When timing and sync drift, network performance and user experience can collapse.

“Timing alignment is fundamental in telecom,” said Kishan Shenoi, timing and synchronization architect for Intel, during a session at ATIS’ recent Workshop on Synchronization and Timing Systems (WSTS). 

In early wireless networks, Shenoi said, synchronization was often referred to under the umbrella of frequency or time “accuracy”, with a focus on the impact on handovers. After all, the ability to move and maintain a connection is one of the defining and most important characteristics of a cellular network. Base stations have to both synchronize with moving user equipment, accounting for Doppler shift, and coordinate with neighboring base stations in order to facilitate smooth handoffs. Cellular networks also require synchronization to an external or absolute source of time across all network nodes, in order to avoid interference and to successfully leverage the advanced RF features mentioned earlier.

This was all a lot simpler in previous generations of cellular technologies, when 1) Frequency Division Duplexing (FDD) was the primary mode in 3GPP systems, in which different frequencies are used for uplink and downlink communications, and 2) RF transmission and reception was less complex, relying on lower frequencies, single spectrum bands rather than carrier aggregation, and single-input-single-output or lower-order multiple-input-multiple-output antennas as opposed to modern massive MIMO systems. 

Per Lindgren, group CTO and head of sync at Net Insight, which provides a network-based timing distribution solution, said that in 5G midband TDD systems, the accurate synchronization of both time and phase is much more complex. The move to widespread deployment of TDD spectrum for 5G by most major carriers “was the big shift, and I don’t think people realized what a big shift that was,” he reflected.

As systems have migrated to TDD and 5G, with increasingly advanced radio features, the range for allowable timing alignment error (TAE) in cellular systems has tightened over the years. In CDMA systems, base stations had to maintain a TAE of less than 10 microseconds. In modern LTE and 5G systems, Shenoi explained, TDD requirements for TAE can be less than 3 microseconds, and for some services in 5G clusters, the timing alignment error must be less than 130 nanoseconds. Meanwhile, timing source servers have been distributed more widely throughout the network, partly because most operators lease at least a portion of their transport network and therefore do not control it end to end.
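To put those shrinking budgets in perspective, here is a toy calculation. The TAE limits are the figures quoted above; the per-hop time-error values for the clock chain are purely hypothetical illustrations, not numbers from the article or any standard:

```python
# Illustrative only: check a cumulative time-error budget against the
# TAE limits quoted above. Per-hop error values are hypothetical.

TAE_LIMITS_NS = {
    "CDMA": 10_000,           # < 10 microseconds
    "LTE/5G TDD": 3_000,      # < 3 microseconds
    "5G cluster services": 130,  # < 130 nanoseconds
}

def total_time_error_ns(per_hop_errors_ns):
    """Worst-case accumulation: constant time errors add linearly."""
    return sum(per_hop_errors_ns)

# Hypothetical chain: a timing source, five intermediate network
# elements, and the end application, each contributing some error.
chain_ns = [100, 50, 50, 50, 50, 50, 100]
total = total_time_error_ns(chain_ns)

for system, limit in TAE_LIMITS_NS.items():
    verdict = "fits within" if total < limit else "blows"
    print(f"{system}: {total} ns total error {verdict} the {limit} ns budget")
```

Even this modest hypothetical chain, comfortably inside the older microsecond-scale limits, overshoots a 130-nanosecond budget, which is why tighter 5G services push timing sources deeper into the network.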

Carrier aggregation, geolocation, Multimedia Broadcast Multicast Services (MBMS, or eMBMS/5G Broadcast) and Inter-Cell Interference Coordination (ICIC, or enhanced ICIC) are all examples of cellular network features or services that depend on signal processing and must be properly synchronized in order to function as intended.

“The degradation with lack of synchronization is dramatic,” Shenoi warned.

Open RAN introduces additional timing and synchronization complexity through the disaggregation of RAN functions across fronthaul and midhaul connections, in addition to the typical backhaul. In fact, in O-RAN, synchronization is so vital that it is treated as a separate plane: alongside the data plane and control plane there is a synchronization plane, or S-plane. “It has been elevated, it’s not just an afterthought anymore,” Shenoi said.
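The S-plane is commonly carried by the Precision Time Protocol (IEEE 1588, PTP). As a rough sketch of what that protocol computes, here is the standard two-timestamp exchange PTP uses to estimate a local clock's offset from a master, assuming symmetric path delay; the timestamp values below are invented for illustration:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Standard IEEE 1588 two-way exchange:
    t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Assumes forward and reverse path delays are equal."""
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way path delay estimate
    return offset, delay

# Hypothetical example: slave clock runs 500 ns ahead of the master,
# and the one-way propagation delay is 2000 ns.
t1 = 0
t2 = t1 + 2000 + 500   # propagation, plus the slave's 500 ns offset
t3 = t2 + 1000         # slave processing time before Delay_Req
t4 = t3 + 2000 - 500   # propagation, minus the offset in master time
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)   # 500.0 2000.0
```

The symmetric-delay assumption is exactly where leased transport hurts: any asymmetry between the forward and reverse paths shows up, halved, as an uncorrectable offset error, which is one reason operators who do not control the end-to-end network need timing sources closer to the edge.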
