Some DSL (Digital Subscriber Line) providers have certainly led the industry in communicating rates for their service that their customers are unlikely ever to experience, with statements such as “Speeds up to 50 Mbit/s”; note that all-important ‘up to’ qualifier. Ofcom in the UK has started an investigation into this practice. Physical loop limitations mean that unless your house is next to the local exchange, the actual throughput will be much lower, typically one half to one quarter of the headline rate.
Earlier this year I did a quick survey examining the mobile industry and its claims as multiple wireless technologies are offered to customers. The US is focusing on average download speeds and defining the conditions under which those speeds are achieved. The rest of the industry uses ‘up to’, in some cases with guidance on what can typically be expected.
For DSL, throughput depends on loop length, noise at the local exchange (generally Gaussian), concentration in the carrier’s network, and congestion on the site being accessed. For example, at home I know I get 2.8 Mbit/s rain or shine. For my mobile data service, however, the throughput depends on my distance from the tower, the number of people simultaneously using that tower, whether I’m indoors, the weather (fading is Rayleigh, not Gaussian, so sometimes it just does not work), which device I’m using, and what smarts the operator has in their network to squeeze the data, as well as the concentration in the carrier’s network and congestion on the site/service being accessed.
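As a rough illustration of why the two feel so different, the sketch below draws throughput samples for a stable DSL line with a little Gaussian wobble and for a mobile link whose received power follows a Rayleigh-fading (exponential) distribution. All rates and noise figures are assumptions chosen for illustration, not measurements of any real network, and throughput-proportional-to-power is a deliberate simplification.

```python
import random
import statistics

# Illustrative sketch only: contrast the spread of throughput samples on a
# stable DSL line with a mobile link subject to Rayleigh fading. The sync
# rate, noise sigma, and mean mobile rate below are assumptions.

random.seed(1)

def dsl_sample(sync_mbps=2.8, sigma=0.05):
    # DSL: small Gaussian wobble around a fixed sync rate.
    return max(0.0, random.gauss(sync_mbps, sigma))

def mobile_sample(mean_mbps=4.0):
    # Crude proxy: treat throughput as proportional to received power,
    # which under Rayleigh fading is exponentially distributed.
    return random.expovariate(1.0 / mean_mbps)

dsl = [dsl_sample() for _ in range(10_000)]
mobile = [mobile_sample() for _ in range(10_000)]

for name, samples in (("DSL", dsl), ("Mobile", mobile)):
    print(f"{name:6s} mean={statistics.mean(samples):5.2f} Mbit/s  "
          f"stdev={statistics.pstdev(samples):5.2f} Mbit/s")
```

The DSL samples cluster tightly around the sync rate, while the mobile samples end up with a standard deviation comparable to their mean, which is exactly the variability described above.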
My experience with throughput on my mobile service is definitely more variable. This raises the question: does average download speed mean anything? ‘Up to’ speeds have been used to play marketing games about size, when what really matters is how long it takes to download an email with a 1 MB attachment, how long it takes for the BBC news website front page to load, and how often a webpage or email download times out. This is definitely something that an independent agency, such as J.D. Power and Associates or Consumer Reports, should be evaluating and comparing across operators as mobile access to the Internet becomes as pervasive as mobile voice services. In an ideal world it would be nice to see a set of measurements taken around a region, for example the NYC metro area, comparing operators’ performance by the mean and by the percentage of measurements above some ‘adequacy’ user-experience level.
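A minimal sketch of the kind of per-operator comparison suggested above is given below. The operator names and throughput readings are made up, and the 1 Mbit/s ‘adequacy’ threshold (enough to pull a 1 MB attachment in roughly 8 seconds, ignoring protocol overhead) is an assumption; the point is simply to show the mean and the percentage-above-threshold being reported side by side.

```python
import statistics

ADEQUATE_MBPS = 1.0      # assumed 'adequacy' user-experience level
ATTACHMENT_MBITS = 8.0   # a 1 MB email attachment is 8 Mbit

# Hypothetical throughput readings (Mbit/s), one per test in a metro area.
measurements = {
    "Operator A": [0.4, 1.2, 3.5, 0.9, 2.1, 1.8, 0.2, 2.7],
    "Operator B": [1.1, 1.3, 1.0, 1.4, 0.8, 1.2, 1.5, 1.1],
}

for operator, rates in measurements.items():
    mean_rate = statistics.mean(rates)
    pct_adequate = 100 * sum(r >= ADEQUATE_MBPS for r in rates) / len(rates)
    mean_dl_time = statistics.mean(ATTACHMENT_MBITS / r for r in rates)
    print(f"{operator}: mean {mean_rate:.1f} Mbit/s, "
          f"{pct_adequate:.0f}% of tests adequate, "
          f"1 MB attachment in {mean_dl_time:.1f} s on average")
```

Two operators can have similar means yet very different percentages above the adequacy level, which is why reporting both tells you more than an average or an ‘up to’ figure alone.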