Why does the broadband industry, supposedly a “high technology” one, lag behind old and largely defunct industries that have now reached the “museum piece” stage?
Last week I was in the National Slate Museum in Wales watching slate being split apart. On the wall were sample pieces of all the standard sizes. These have cute names like “princess”. For each size there were three standard qualities: the thinnest are the highest quality (at 5mm in thickness), and the thickest are the lowest quality (13mm or more). Obviously a lighter slate costs less to transport, and lets you roof a wider span with less supporting wood, so it is worth more.
These slates were sold around the world, driven by the industrial revolution and the need to build factories and other large structures for which “traditional” methods were unsuitable. Today we are building data centres instead of factories, and the key input is broadband access rather than building materials. Thankfully telecoms is a far less dangerous industry, and doesn’t give us lung disease that kills us off in our late 30s. (The eye strain and backache from hunching over iDevices is our deserved punishment for refusing to talk to each other!)
What struck me was how this “primitive” industry had managed to create standard products, in terms of quantity and quality, that were clearly fit for purpose for different uses: main roofs versus drainage versus ornamental work. This is in contrast to broadband, where there is high variability in the service, even when the same product from the same operator is delivered to different end users.
With broadband we don’t have any kind of standard units that let buyers evaluate a product, or know whether it offers better or worse utility and value than another. The only promise we make is not to over-deliver, by setting an “up to” maximum burst data throughput! Even this says nothing about the quality on offer.
In this sense, broadband is an immature craft industry which has yet to reach even the most basic level of sophistication in how it defines its products. To a degree this is understandable: the medium is statistically multiplexed, so it is naturally variable in its properties. We haven’t yet standardised the metrics in which quantity and quality are expressed for such a thing. The desire is for something simple like a scalar average, but there is no quality in averages.
Hence we need to engage with the probabilistic nature of broadband, and express its properties as odds, ideally using a suitable metric space that captures the likelihood of the desired outcome happening. This is by its nature an internal measure for industry use, rather than something end consumers would be exposed to.
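To make the “no quality in averages” point concrete, here is a minimal Python sketch, using purely illustrative numbers of my own invention, of two hypothetical links that share the same average latency yet offer very different odds of a packet arriving within a 40ms bound:

```python
import random

random.seed(42)

# Two hypothetical links with the same average latency (~30ms), but very
# different variability. All distributions and numbers are assumptions
# for illustration only.
steady_link = [random.gauss(30, 2) for _ in range(10_000)]  # low jitter
bursty_link = [random.gauss(25, 1) if random.random() < 0.9
               else random.gauss(75, 10)                    # occasional spikes
               for _ in range(10_000)]

def mean(samples):
    return sum(samples) / len(samples)

def odds_within(samples, bound_ms):
    """Fraction of packets whose latency falls within the bound."""
    return sum(s <= bound_ms for s in samples) / len(samples)

for name, samples in [("steady", steady_link), ("bursty", bursty_link)]:
    print(f"{name}: mean = {mean(samples):.1f}ms, "
          f"P(latency <= 40ms) = {odds_within(samples, 40):.1%}")
```

Both links report roughly the same 30ms average, but only the steady one reliably delivers the outcome; the odds, not the average, describe the quality on offer.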
Without standard metrics and measures, and transparent labelling, a properly functioning market with substitutable suppliers is not possible. The question that sits with me is: whose job is it to standardise the product? The regulator? Equipment vendors? Standards bodies? Network operators? Industry trade groups? Or someone else?
At the moment we seem to lack both awareness of the issue and incentives to tackle it. My hunch is that the switch-over to software-defined networks will be a key driver for change. When resources are brought under software control, they have to be given units of measure. Network operators will have a low tolerance for control systems that have vendor lock-in at this elementary level. Hence the process of standardising the metrics for quantity and quality will rise in visibility and importance over the next few years.
For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.