As a result of intense lobbying, the EU has passed a set of populist ‘net neutrality’ laws as Regulation (EU) 2015/2120. These laws promote a false doctrine of fairness to packets, in the name of ‘non-discrimination’. This confuses an unintentional allocation of resources to packets with intentional justice for people.
The national telecoms regulators (the National Regulatory Authorities, or NRAs) are now tasked with implementing these EU-level laws. The problem is that these laws are technically incompetent: they fail to reflect the quintessentially stochastic nature of broadband and its constraints. It is the mathematical equivalent of demanding faster-than-light communications, or legislating the value of π (again).
As a consequence, these laws are effectively unimplementable. Any attempt to enforce them will result in avoidable ‘iatrogenic’ harm to customers: harm where the treatment causes more suffering than the disease it purports to remedy.
This sets up a nightmare for the 28 NRAs, who are now caught between their obligation to uphold the law and the unyielding mathematics of multiplexing. One of them has to give, and the mathematics isn’t negotiating. So what should the NRAs do?
What the law says
The law is emphatic that “Providers of internet access services shall treat all traffic equally”. The use of (implicitly differential) traffic management is to be absolutely minimised, i.e. measures to “mitigate the effects of exceptional or temporary network congestion” can sparingly be applied and “shall not be maintained for longer than necessary”.
Knowing that “best effort” by definition has unpredictable performance, there are rules for “optimised services” that have more predictable quality characteristics.
Why these laws are problematic
There are many practical and philosophical reasons why these laws are misinformed and potentially malignant. I want to pick on the most important one, and save the detailed line-by-line critique for another day.
The core harm is the active discouragement of “resource trades”. This is in direct contradiction to the purpose of broadband: to enable statistical sharing by allowing contention. You minimise cost and maximise QoE by engaging in as many trades as possible, not as few as possible.
The false assumption being made is that you can always economically provide enough idle assets to maintain quality, when you can’t. The unintended consequence is a conflict between two outcomes, neither of which is desirable:
- Either spend and charge more (without bound) – which is bad for customers; or
- If you can’t spend more, let the quality drop arbitrarily low – which is bad for customers.
In other words, these regulations assume both that a quality floor exists and that the service can be made to work well for all people all of the time. This is the road to statistical multiplexing hell.
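The arithmetic of that hell can be sketched in a few lines. The toy simulation below uses illustrative numbers of my own (not drawn from any real network): bursty on/off users share one link, and we compare provisioning for every simultaneous peak against a statistically shared link sized at twice the mean load. The shared link is far cheaper, but saturates in some small fraction of time slots; quality is a random variable, not a guaranteed floor.

```python
import random

random.seed(42)

N_USERS = 100     # bursty subscribers sharing one link (illustrative)
PEAK = 10.0       # Mb/s while a user is actively sending
DUTY = 0.1        # each user is active in 10% of time slots
SLOTS = 100_000   # simulated time slots

# Provisioning for the worst case: every user bursting at once.
peak_capacity = N_USERS * PEAK                 # 1000 Mb/s

# Statistical sharing: provision 2x the *mean* offered load instead.
shared_capacity = 2 * N_USERS * DUTY * PEAK    # 200 Mb/s

saturated = 0
for _ in range(SLOTS):
    # Count users active this slot; each is independently on with prob. DUTY.
    active = sum(random.random() < DUTY for _ in range(N_USERS))
    if active * PEAK > shared_capacity:
        saturated += 1

print(f"peak provisioning: {peak_capacity:.0f} Mb/s")
print(f"shared link:       {shared_capacity:.0f} Mb/s, "
      f"saturated in {100 * saturated / SLOTS:.3f}% of slots")
```

The shared link is one fifth of the peak-provisioned size, yet saturates only in rare slots. Driving that residual saturation to zero, for all users at all times, is exactly what pushes the required capacity (and cost) without bound.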
The NRA’s dilemma
The newly constrained regulatory space is now in tension with each NRA’s underlying duty of care to citizens: the law in effect tries to enforce a quality floor through overprovisioning. This avoids the real issues of fitness-for-purpose and the growing diversity of demand.
Specifically, the requirement to be “non-discriminatory” at the packet level is fatally flawed for three reasons:
- It is not objectively defined or definable. What arrival and departure traffic patterns are “non-discriminatory”? Nobody knows.
- It is not observable in practice, as amply shown by Ofcom’s research.
- It is not even relevant to delivering fair and just customer outcomes! Only the (emergent) global end-to-end performance matters, not the configuration of local mechanisms and policies.
By discouraging rational technology choices and economic decisions, these net neutrality regulations are “carcinogenic” – in the sense that they create a real risk of harm to users. This is not a theory. My associates have seen examples of pre-existing harm caused by “neutrality”:
- A large operator unable to launch an ISP service for low-income groups due to fears of being accused of “discrimination”.
- Another large operator facing a service collapse due to “neutral” packet handling.
As entrepreneurs in this space we also note that we face direct costs of determining the impact of these regulations and achieving compliance (as best we can, given their ambiguity). So much for “innovation without permission” and supporting small broadband service provider businesses!
The NRAs now face a dilemma: how to reconcile these laws with their instructions to protect consumers and support the development of the broadband industry and ICT services in general?
The NRAs face an unenviable task
The NRAs now face the task of individually and collectively sorting out the mess. There has to be a process of negotiation to realign the policy implementation with the technical reality of statistically multiplexed networks.
The alternative is to allow the current risk to turn into significant harm to consumers. This will undermine the economic basis of the broadband industry, encourage consolidation, and drive out smaller players and new entrants. The foreseeable nature of the harm will also damage the legitimacy of the NRAs and the EU’s regulatory framework.
Furthermore, undermining the sustainability, performance and development of data networks has wider consequences for industrial and social policy. Broadband’s aspiration to become a full-blown utility alongside water, electricity and gas is at stake.
What should the NRAs do?
The real issues facing national regulators are:
- How to foster competition, so that traditional and proven approaches to abuse of significant market power can be applied?
- How to create transparency for the overall quality and fitness-for-purpose on offer, so that users can determine which services meet their individual needs?
- Where there is competition, how can switching times and costs be minimised? After all, in the dial-up world the next ISP was only a phone call away.
Regulators are responsible for the consequences of their choices. In the specific case of broadband access, rationality requires us to make ‘resource trades’ for networks to be economically feasible. This implies there should be different quality floors, and a market price for quality. It may even be necessary to enforce a quality floor on players with significant market power.
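What a beneficial ‘resource trade’ looks like can be shown with a toy scheduler (a hypothetical two-flow model with made-up numbers, not a description of any real ISP’s traffic management). A bulk transfer and a sparse stream of latency-sensitive packets share one link that serves one packet per time slot; letting the latency-sensitive packets jump the queue costs the bulk flow a few slots while removing all of the other flow’s queueing delay:

```python
from collections import deque

BULK_PKTS = 50      # bulk transfer, all queued at t = 0 (illustrative)
VOIP_PKTS = 5       # latency-sensitive packets...
VOIP_PERIOD = 10    # ...arriving one every 10 slots

def simulate(trade: bool):
    """Serve one packet per slot. Returns (mean voip delay, bulk finish slot)."""
    queue = deque(("bulk", 0) for _ in range(BULK_PKTS))
    voip_delays, bulk_served, bulk_finish, t = [], 0, 0, 0
    while bulk_served < BULK_PKTS or len(voip_delays) < VOIP_PKTS:
        if t % VOIP_PERIOD == 0 and t // VOIP_PERIOD < VOIP_PKTS:
            if trade:
                queue.appendleft(("voip", t))   # traded: jump the queue
            else:
                queue.append(("voip", t))       # 'neutral' FIFO arrival order
        if queue:
            kind, arrived = queue.popleft()
            if kind == "voip":
                voip_delays.append(t - arrived)
            else:
                bulk_served += 1
                bulk_finish = t
        t += 1
    return sum(voip_delays) / VOIP_PKTS, bulk_finish

print("neutral FIFO:", simulate(False))
print("with trade  :", simulate(True))
```

With these numbers, ‘neutral’ FIFO handling gives the latency-sensitive flow a mean delay of 32 slots with the bulk transfer finishing at slot 49; the traded schedule gives it zero delay with the bulk transfer finishing at slot 54, just five slots (10%) later. Both parties could rationally agree to that trade – which is precisely what the regulation discourages.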
The only alternative to market pricing is unpopular rationing through means like data caps. Indeed, ‘net neutrality’ is itself a form of rationing (via a hoped-for magical self-optimisation of networks).
What are the next steps for the NRAs?
There are three key steps that NRAs must take if they are to extricate themselves from this ‘neutrality’ nightmare.
Firstly, internal education. Spectrum policy is made on the basis of the physics of electromagnetism, which is well understood. The broadband industry, by contrast, is immature, and its scientific base is still forming. NRAs (and operators) need to understand the true (stochastic) nature of broadband, and how it relates to the policy options they face.
Secondly, validation and dissemination of the science. Regulators should validate the science with their national experts. I suggest that BEREC should create a reference model of how the network operation relates to the customer experience, and what the constraints are. Regulators also need to pick the right quality metric, and it would make most sense for this to be done collectively.
Thirdly, engagement with consumer advocates. The concept of ‘neutrality’ has been sold as being an effective means of protecting users from harm. Consumer advocates need to be informed that they have been fooled, and that the result will be the exact opposite. A quality floor offers both transparency and protection, whereas ‘neutral’ traffic management offers neither.
The good news is that the EU law does provide a framework for implementing a quality floor, and that by shifting emphasis a technical and political solution can be found.
For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.