The telecoms business has built its success on the back of advances in physics. This is turning into a cultural liability as it becomes a component of a distributed computing services industry.
In theory, the telecoms industry is pathetically simple to the point of utter triviality. It copies information at a distance, unchanged. The ideal telecoms service quite literally does nothing! All we do is build large spread-out supercomputers highly optimised to compute the identity function.
In practice, the telecoms industry is extremely complex and filled with richness. At its heart, the industry contains a three-way power struggle. Each gallant group is concerned with a different fundamental limit of the world:
- “Speed of light freedom fighters” – electromagnetism experts who construct resources to transmit data at a distance.
- “Brave bitstream breakers” – who put the “distributed” into distributed computing by using packets and protocols to share those transmission resources.
- “Bang for your buck brigade” – product marketers, back office developers, and business managers who look after the limited capital available for the above two activities.
These respectively reflect the cosmic, ludic and economic constraints of telecoms. They are in turn naturally dominated by experts in physics, computer science, and law and finance. After all, for every network element there is an equal and opposite regulatory lawyer. Nature insists on it!
Historically, the essential transition that got telecoms going was from the postal system (at the speed of pony) to the telegraph, telegram and telephone (at the speed of light, minus encoding and decoding overheads). Doing “electromagnetism over a distance”, with minimal transformation, is an activity that naturally aligns with skills in physics.
As humans, we all inhabit sociotechnical systems that offer incentives and status as a reward for exhibiting “acceptable” behaviour. The top technical prizes in telecoms have traditionally accrued to those with a physics background (allied to strong numeracy). This started with Bell Labs and information theory, if not earlier, and continued through the invention of transistors and lasers. It didn’t stop for at least fifty years thereafter.
The heroes and heroines were clear: those who enabled a greater quantity of information to be delivered at or near the speed of light. Then as the telecoms industry developed, it became about delivering circuits with ever more throughput. On fixed access networks this meant wavelength-division multiplexing of light and protocols like SDH. On mobile cellular networks we came up with clever antenna designs and mathematically-inspired technologies like CDMA and OFDM.
This kept those with a physics mindset at the forefront of the industry, and rightly so. The problems to be overcome lay in turning essential insights into the properties of light and radio waves into network mechanisms to channel them. Telecoms was transmission, and its circuits emulated a dedicated point-to-point transmitter.
Then in the 1970s we began to attach computers to data networks everywhere. At the same time, we experimented with a subtle change: from fixed time slots in circuits, to variable ones with packet data. This ostensibly was a minor modification to the existing paradigm, so as to accommodate the bursty nature of computer data. The innate variability of data demand made traditional circuits unaffordable, and the arbitrage of statistical multiplexing economically irresistible.
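To make that arbitrage concrete, here is a minimal sketch of statistical multiplexing. It is my own toy illustration with made-up figures (a hundred bursty sources, each active a tenth of the time), not anything drawn from the industry’s planning tools. Dedicated circuits must be sized for every source’s peak; a shared packet link only has to cover the aggregate demand that actually occurs.

```python
import random

# Toy illustration of statistical multiplexing (assumed, made-up parameters):
# 100 bursty sources, each "on" 10% of the time at a peak rate of 1 Mb/s.
SOURCES, PEAK_MBPS, ACTIVITY = 100, 1.0, 0.1
TRIALS = 100_000

def offered_load():
    """Instantaneous aggregate demand: the sources that happen to be 'on' right now."""
    return sum(PEAK_MBPS for _ in range(SOURCES) if random.random() < ACTIVITY)

samples = sorted(offered_load() for _ in range(TRIALS))
p999 = samples[int(0.999 * TRIALS)]  # capacity that suffices 99.9% of the time

print(f"Dedicated circuits: {SOURCES * PEAK_MBPS:.0f} Mb/s of capacity")
print(f"Shared packet link (99.9th percentile of demand): {p999:.0f} Mb/s")
```

With these (invented) numbers the shared link needs roughly a fifth of the capacity of the circuit build; that gap is the economic gravity that pulled computing traffic onto packets.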
Everything looked so similar… so the belief and reward systems went unchallenged and unchanged. Protocols like TCP/IP try to maximise throughput, searching for “space” in the “pipe” in order to fill it with the maximum quantity. We sent as many packets as possible, as fast as possible, choosing to use what are called “work-conserving” queues. We even reused the same term, “bandwidth”, to refer to the resulting “flow” of datagrams.
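For the non-specialist: a “work-conserving” queue is simply one that never lets the link go idle while any packet is waiting. The sketch below is again my own, purely illustrative; it shows the consequence of that rule, namely that throughput is maximised by construction while delay is an uncontrolled by-product.

```python
from collections import deque

def work_conserving_fifo(arrival_times, service_time=1.0):
    """Serve packets in order; the link idles only when there is nothing at all to send.
    Returns the delay (queueing + service) experienced by each packet."""
    queue, delays, now = deque(sorted(arrival_times)), [], 0.0
    while queue:
        pkt = queue.popleft()
        now = max(now, pkt)      # wait only if the queue was genuinely empty
        now += service_time      # otherwise keep the "pipe" busy
        delays.append(now - pkt)
    return delays

# A burst of ten back-to-back packets: every unit of capacity is used,
# but the last packet in the burst waits ten service times.
print(work_conserving_fifo([0.0] * 10))
```

Nothing in that schedule decides what quality each packet receives; it simply fills the pipe, which is exactly the “maximum quantity” mindset described above.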
The “stupid network” was an attempt to contain this “packets and protocols” computing service within the familiar “pipe” metaphor. This made it safe and understandable to physicists, since it still appeared to fit logically into the transmission model. That preserved the seeming relevance of their expertise and the social status that came with it. It also alleviated anxiety and an unconscious ambivalence about these peculiar packets and their unnatural performance behaviour.
However, what happened (in a way that has gone largely unnoticed) is that this bred a culture of “physicism”. This is a play on “scientism”, a misplaced faith in science having the answers to all of life’s big questions. Correspondingly, “physicism” is the (false) hope that the philosophical tools of natural science answer all the important questions about telecommunications.
Now, I know that many of you reading this are physicists. At the risk of causing unintended offence, I believe Carly Simon once sang something along the lines of “I bet you think this blog is about you”. It’s not about physics or physicists, so it’s not about you at all. Indeed, no judgement about any individual or class is implied, whether positive or negative. To err is merely human; to do it at scale, it helps to have networked computers.
What it is about is a flawed belief system that underpins a systemic cultural issue. We failed to notice how broadband was a fundamental break with the past, and the beginning of a transition to a new paradigm. The result is a “cult of bandwidth”: a doctrine with quasi-mythical dogmas, such as the nonsensical “end-to-end principle”.
The imprecise term “broadband” conflates two things: the move to high-speed transmission (very physics-friendly), and the introduction of distributed computing technologies (not so physicsy). The latter is a basic change: from constructing the resource to dynamically sharing it. It represents a shift in the fundamental constraint, from the cosmic to the ludic.
Whereas before “success” was to create as much resource quantity as possible, in the computing world it is to allocate as little resource quality as is tolerable. In the former, we maximise the “information flux” to create value; in the latter paradigm, we wish to minimise it, to contain cost. The “polarity” of the telecoms game has (somewhat paradoxically) inverted.
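One way to see the inversion is as two different optimisation problems over the same link. The sketch below is my own framing, with toy numbers and a deliberately crude delay model; it is not a real planning calculation.

```python
CAPACITY_MBPS = 100.0      # what the transmission engineers built (toy figure)
DEMAND_MBPS = 30.0         # what the application actually offers (toy figure)
DELAY_TARGET_MS = 20.0     # the quality bound the application can tolerate

def estimated_delay_ms(allocation_mbps):
    """Crude M/M/1-style delay curve, purely for illustration."""
    utilisation = DEMAND_MBPS / allocation_mbps
    return 1.0 / (1.0 - utilisation) if utilisation < 1 else float("inf")

def old_game():
    """Cosmic-era success: maximise information flux -- fill the pipe."""
    return CAPACITY_MBPS

def new_game():
    """Ludic-era success: allocate the least resource that still meets the quality target."""
    allocation = DEMAND_MBPS
    while estimated_delay_ms(allocation) > DELAY_TARGET_MS:
        allocation *= 1.1   # add resource only when quality forces you to
    return allocation

print(f"Old objective delivers {old_game():.0f} Mb/s; "
      f"new objective allocates {new_game():.0f} Mb/s and banks the rest.")
```

The objective function flips: the first maximises what is shipped, the second minimises what is spent while a quality constraint is honoured.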
This shift is slow and subtle, and was initially hidden by a transient period in which it was possible to deliver both better quantity and quality, at rapidly falling cost. That was done simply by riding on the back of the hyper-growth in resource capacity delivered by those clever electromagnetism experts. This historical windfall will not be repeated, and its returns are declining (and will soon turn negative).
Its legacy is that our “marketing compasses” now point to a magnetic “broadband speed test” north that is nowhere near the “profit maximisation” pole. There is no Moore’s Law for networks as complete systems, so the divergence will only grow as we move through the business “polarity inversion”.
This means we face an increasing dissonance between the “physicism” culture of the telecoms industry and the growing dominance of a distributed computing and cloud-driven reality. There are hard ludic limits that even improved cosmic-busting technology cannot overcome. Whilst herbivorous culture may eat grassy business strategy for breakfast, carnivorous mathematics dines out on human culture steak every night.
For the latest fresh thinking on telecommunications, please sign up for the free Geddes newsletter.