Fastest isn’t necessarily best: SRG puts three 5G networks to an apples-to-apples test
Speed, speed and more speed: Much of consumer expectation, and of industry testing and bragging, is focused on the new heights of speed that 5G can achieve. The question of which operator has the fastest 5G network has recently drawn mixed answers from different testing companies.
Signals Research Group has been analyzing the three national carriers’ 5G networks as well, and in its newest report, it does crown a winner with the “best” network performance. But that winner (AT&T) isn’t actually the fastest network.
In its approach to testing, SRG has often focused on maximum throughput, because it delves into Radio Access Network performance and the availability and impacts of specific features. This latest report, however, shifts toward "understanding network performance and how it impacts the user experience," as the report puts it.
In SRG's testing, AT&T actually turned in the slowest results in the tests designed to focus on throughput. But that was only one facet of the testing, which was conducted in December in the Dallas, Texas market.
“Speed is important, but it’s not the only criteria,” said Mike Thelander, CEO of SRG, adding, “When you start looking at typical user behavior — if it’s, say, web browsing or if it’s downloading a small amount of data and posting to Facebook; video — the amount of data that you’re sending is not necessarily challenging the network.” Instead, other factors such as latency and responsiveness in interactive applications become important. And even when it comes to examining latency, Thelander adds, there’s a difference in how the network handles one packet, versus larger chunks of data sent over and over and over again.
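The distinction Thelander draws, the round-trip time of a single small packet versus the sustained transfer of larger chunks sent repeatedly, can be illustrated with a minimal sketch. This is not SRG's or Rohde & Schwarz's methodology; it is a toy measurement against a local TCP echo server, with payload sizes and round counts chosen purely for illustration:

```python
import socket
import threading
import time

def echo_server(server_sock):
    """Accept one connection and echo everything back until the peer closes."""
    conn, _ = server_sock.accept()
    with conn:
        while True:
            data = conn.recv(65536)
            if not data:
                break
            conn.sendall(data)

def drain(conn, nbytes):
    """Read until `nbytes` of echoed data have come back."""
    received = 0
    while received < nbytes:
        received += len(conn.recv(65536))

def measure(conn, payload, rounds):
    """Average seconds per round trip for `payload`, over `rounds` repetitions.
    A reader thread drains the echo concurrently so large sends can't deadlock."""
    start = time.perf_counter()
    for _ in range(rounds):
        reader = threading.Thread(target=drain, args=(conn, len(payload)))
        reader.start()
        conn.sendall(payload)
        reader.join()
    return (time.perf_counter() - start) / rounds

server = socket.socket()
server.bind(("127.0.0.1", 0))  # any free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
# One tiny packet approximates pure round-trip latency; a 1 MB payload
# sent repeatedly approximates a sustained bulk transfer.
small = measure(client, b"x" * 32, rounds=50)
large = measure(client, b"x" * 1_000_000, rounds=5)
print(f"32-byte round trip: {small * 1e6:.0f} us")
print(f"1 MB round trip:    {large * 1e3:.2f} ms")
client.close()
```

On a real cellular network the gap between the two numbers reflects the behaviors Thelander describes: a light interactive exchange never stresses the link, while a repeated bulk transfer does.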
"It's like you have a Ferrari and you want to measure how fast it goes — are you measuring 0-to-60 on an open track? Are you measuring peak performance over an entire lap, is it 0-to-10? All those same kind of things come into play," he said.
What criteria are tested, and how they are tested, makes a big difference in whether test results can be compared across different markets and over time. One of the things that makes SRG's approach in this testing unique is that it relied on ETSI's TR 103 559, in which the standards group lays out best practices for "robust network QoS benchmark testing and scoring." That ETSI TR has been implemented in Rohde & Schwarz's mobile network testing products and its post-processing analytics software, so SRG used R&S' testing tools in order to get results in line with the ETSI-ratified test methodology. (SRG conducted some of the tests, but Rohde & Schwarz conducted most of the drive testing to collect the data for the report, due to COVID-19 concerns.)
So while the results are relevant for what they are (outdoor drive-testing-based evaluation of the three national carrier networks, using three Samsung Galaxy S20+ smartphones, with 115 hours of testing, 5,071 kilometers of driving, 1,300+ voice calls per network, and 60,000+ data-related tests including HTTP browsing, video, social media, HTTP file transfers and capacity tests, as well as interactivity tests), they also contrast with rankings from other testing companies because they rely on a standardized, public test methodology rather than a proprietary one.
The results of the testing, by the way, were: AT&T ranked best in overall network performance, followed by Verizon and then T-Mobile US. Voice performance among the three carriers was practically equal; it was data performance across multiple tasks that differentiated them.
SRG's analysis made a number of interesting points on speed versus performance. "Peak data speeds, especially when measured to a server collocated in an operator's network or in close proximity to an operator's data center, have little bearing when determining the network which will deliver the best user experience with more typical use cases (small data transfers, web browsing, video, social media, eGaming, etc.)," the report says. "Likewise, an operator's network with the most 5G NR coverage doesn't immediately equate to a tangible net gain in performance. … Furthermore, in many cases the operator with the most 5G NR network coverage delivered a poorer user experience when the smartphone used 5G NR or a mix of LTE and 5G NR in a given test than when the smartphone merely relied on LTE."
One particularly novel aspect of SRG's testing was that it involved interactivity testing — meant to mimic a consumer's interaction with highly latency-sensitive, real-time apps such as mobile gaming. That type of testing is still being hammered out in standards groups, according to SRG and Andreas Roessler, technical marketing manager at Rohde & Schwarz.
"What [SRG] did is basically pioneering testing in the networks, based on something that will be added to the recommendation," Roessler said.
“The testing we did in interactivity, there were big differences across the networks — but then, none of them really did well compared to results we’ve seen in other markets,” Thelander said, adding that the implementation of network slicing could improve the ability to support real-time applications.
Emil Olbrich, SRG's VP of network technology, discusses the testing further in an accompanying video, and a preview of the SRG report is also available.