The UK’s mobile networks are transforming rapidly to meet the growing needs of individuals and businesses. In both spheres, usage of cloud-based instant messaging, data storage, streaming, and other data-heavy services has skyrocketed, along with the volume of content these services carry. UK operators appear to be tackling this demand head-on: EE is reportedly the first European MNO to approach gigabit performance on its LTE network, recently demonstrating download speeds of up to 765 Mbps and uploads of 110 Mbps during a trial at Wembley Stadium, the UK’s national soccer arena. Meanwhile, O2 and the Wireless Infrastructure Group (WIG) have installed the UK’s first fiber-connected small cell network in Aberdeen, Scotland, enabling O2 to provide advanced 4G services now while getting a head start on the infrastructure needed for 5G deployment.
In North America, Dual Band LTE is gradually giving way to LTE-A, which is becoming available across Canada and the US. Users will theoretically have access to speeds of up to 750 Mbps, with average speeds ranging from 22 to 174 Mbps. In time, 2G spectrum will be shut down altogether and repurposed primarily for 4G, with 5G added on top as an even faster layer. Before we get ahead of ourselves, though, businesses and consumers should ask how much bandwidth they really need for their day-to-day tasks right now, and just how reliable the connection is. These are critical concerns that are not easy to work out: there is a morass of coverage-checking, speed-test, and problem-reporting websites and apps in which one can get lost while attempting to understand the strength of a network. Ultimately, real-world network testing performed by specialists (using controlled assessment methods combined with specialized equipment and everyday mobile devices) is crucial if carriers and their customers want an in-depth understanding of the speeds, voice quality, coverage, and reliability available in real life.
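To put the bandwidth question raised above into perspective, the short sketch below simply adds up illustrative bitrates for a handful of everyday tasks. The task list and figures are ballpark assumptions made for the sake of the example, not measured values.

```python
# Rough, illustrative estimate of concurrent bandwidth demand for everyday tasks.
# The bitrates below are assumed ballpark figures, not measured values.
TYPICAL_BITRATES_MBPS = {
    "HD video streaming": 5.0,
    "Video conferencing": 2.5,
    "Cloud file sync": 2.0,
    "Web browsing / messaging": 0.5,
}

def required_headroom(tasks, margin=1.5):
    """Sum the bitrates of concurrent tasks and apply a safety margin."""
    base = sum(TYPICAL_BITRATES_MBPS[t] for t in tasks)
    return base * margin

if __name__ == "__main__":
    tasks = ["HD video streaming", "Video conferencing", "Web browsing / messaging"]
    print(f"Estimated concurrent demand: {required_headroom(tasks):.1f} Mbps")
    # Even a generous estimate lands far below headline peak speeds such as 750 Mbps,
    # which is why reliability and coverage often matter more than raw throughput.
```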
In terms of real-world testing, carriers widely use two main approaches to measure network performance. Collecting and reviewing crowdsourced network data can provide a snapshot of performance, offering surface-level visibility. This method, however, is largely uncontrolled: it pools data from devices that vary in type, age, firmware, and operating system, and it is subject to many other random factors such as time and location. All of this widens the spread of variables to analyze, which can reduce the accuracy of the results, so it is not considered the most precise way to assess the strength and reliability of a network’s performance.
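As a rough illustration of why those uncontrolled variables matter, the sketch below groups hypothetical crowdsourced speed samples by device model and compares the spread within each group to the pooled figure. The sample data and field names are invented purely for the example.

```python
from collections import defaultdict
from statistics import median

# Hypothetical crowdsourced samples: (device_model, hour_of_day, download_mbps).
# Values are invented to illustrate how mixed devices and times widen the spread.
samples = [
    ("phone_a", 9, 48.2), ("phone_a", 18, 21.5), ("phone_b", 9, 95.0),
    ("phone_b", 23, 110.3), ("phone_c", 12, 12.7), ("phone_c", 18, 9.4),
]

by_device = defaultdict(list)
for device, hour, mbps in samples:
    by_device[device].append(mbps)

overall = [mbps for *_, mbps in samples]
print(f"Pooled median: {median(overall):.1f} Mbps, "
      f"range: {min(overall):.1f}-{max(overall):.1f} Mbps")
for device, speeds in by_device.items():
    print(f"  {device}: median {median(speeds):.1f} Mbps over {len(speeds)} samples")

# The pooled figure mixes device capability, time of day and location, so it says
# little about what any one user, on any one part of the network, actually experiences.
```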
The other method is controlled benchmark testing, which uses scientific methods and processes combined with test equipment to perform drive, walk, and in-venue performance tests. This form of testing also captures key device metrics (such as Layer 3 signalling data) that are not available in crowdsourced testing, which is crucial for an in-depth look at how a network is actually performing, what factors drive good and poor performance, and everything in between. Device types and test parameters are standardized throughout the testing process, minimizing the anomalous and erroneous data you might otherwise encounter in an uncontrolled testing environment.
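A controlled campaign pins down the variables that crowdsourcing leaves free. The sketch below shows one possible way to express such a standardized test plan as a simple data structure; the field names and values are illustrative assumptions, not an actual GWS test definition.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BenchmarkPlan:
    """A standardized, repeatable test definition for drive, walk and in-venue testing."""
    device_model: str          # identical devices across all networks under test
    firmware: str              # pinned firmware so results stay comparable
    tests: tuple               # ordered task list executed at every location
    samples_per_location: int  # fixed sample count removes opportunistic bias
    collect_layer3: bool       # capture Layer 3 signalling alongside throughput

plan = BenchmarkPlan(
    device_model="reference-handset",
    firmware="pinned-build-1.0",
    tests=("http_download", "http_upload", "video_stream", "volte_call"),
    samples_per_location=10,
    collect_layer3=True,
)
print(plan)
```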
The granular data that controlled benchmark testing provides becomes even more important when various metrics are loosely interpreted and promoted (for example, the results of capacity and speed tests). Some reports that reach the public confuse matters by presenting operator network data, whether crowdsourced or from controlled testing, in a way that is not entirely representative of what consumers will experience. Promoting the highest theoretical throughput speed available, for example, says little about whether the majority of users will ever see that speed, let alone about the coverage or reliability they can expect (all of which vary across an operator’s network).
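The gap between a headline figure and the typical experience is easy to show. The sketch below compares the single fastest sample in a hypothetical measurement set with the median and a lower-percentile value; the numbers are invented for illustration only.

```python
from statistics import median, quantiles

# Hypothetical download samples (Mbps) across a city; values invented for illustration.
speeds = [310, 145, 98, 76, 61, 55, 48, 40, 33, 27, 22, 18, 14, 9]

peak = max(speeds)
typical = median(speeds)
p10 = quantiles(speeds, n=10)[0]   # roughly the slowest 10% of experiences

print(f"Headline peak:   {peak} Mbps")
print(f"Median user:     {typical:.0f} Mbps")
print(f"10th percentile: {p10:.0f} Mbps")

# Advertising the peak alone says nothing about what the median user sees,
# and even less about coverage or reliability at the edge of the network.
```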
At GWS, our controlled benchmark testing focuses on customer experience: we test the most common tasks that users carry out on their phones, which gives greater insight into customer behavior and network performance. For example, we can tell which networks best support social media applications, and how those applications are functioning for users in any given area.
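As a simplified illustration of that kind of comparison, the sketch below aggregates hypothetical per-task scores by network and area. The networks, areas, tasks, and scores are placeholders, not real results or our actual scoring model.

```python
from collections import defaultdict

# Hypothetical test records: (network, area, task, score 0-100). Placeholder data only.
records = [
    ("network_x", "city_centre", "social_media", 88),
    ("network_x", "city_centre", "video_stream", 74),
    ("network_y", "city_centre", "social_media", 79),
    ("network_y", "city_centre", "video_stream", 81),
]

totals = defaultdict(lambda: [0, 0])
for network, area, task, score in records:
    totals[(network, area, task)][0] += score
    totals[(network, area, task)][1] += 1

for (network, area, task), (total, count) in sorted(totals.items()):
    print(f"{network} / {area} / {task}: {total / count:.0f}")

# Ranking networks per task and per area shows which one best supports, say,
# social media in a given neighbourhood, rather than hiding it in one blended score.
```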
By assessing how different activities perform in different locations, such as downloads and uploads, streaming, web browsing, and VoLTE calling, we get a real-world picture of how networks operate as consumers experience them. Using specialized test equipment installed in vans and backpacks, we can rigorously test mobile networks anywhere and establish, for example, whether a dropped call was caused by coverage, network interference, equipment or system failure, a device-side issue, or any of a variety of other factors. With the results of this diagnosis, operators can optimize their network, improving the quality and availability of their signal and ensuring a better all-around user experience.
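Attributing a dropped call to a cause relies on the radio and signalling metrics that controlled testing captures. The sketch below is a deliberately simplified, rule-based triage over a few such metrics; the thresholds, field names, and categories are assumptions for illustration, not our diagnostic logic.

```python
def classify_drop(event):
    """Very simplified triage of a dropped-call event. Thresholds are illustrative."""
    if event.get("rsrp_dbm", 0) < -115:
        return "likely coverage (weak signal at the drop point)"
    if event.get("sinr_db", 99) < 0:
        return "likely interference (poor signal quality despite usable signal level)"
    if event.get("handover_failed"):
        return "likely network/equipment (handover failure seen in Layer 3 signalling)"
    if event.get("device_radio_reset"):
        return "likely device-side issue"
    return "unclassified - needs deeper Layer 3 analysis"

# Example event with hypothetical field names and values.
event = {"rsrp_dbm": -119, "sinr_db": 3, "handover_failed": False}
print(classify_drop(event))
```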
As the world’s wireless networks expand and improve, performance benchmarking is more important than ever. Real-world customer experience testing through controlled benchmark tests is the most comprehensive way to truly understand network performance at ground level, which is why we’ve been championing it for years.