Anyone who has read my written work will know I’m a bit of a one-trick pony. I start with some cultural reference or anecdote, then segue into a more technical or managerial point. Over the years I have found myself using the same stories more than once: in particular, the moment when Dave Lister finds that he is the only human left alive. Here’s its most recent outing, in relation to GDPR (of all things).
I’m saying this apologetically, because I find myself urged to use it again. For months now, since being involved behind the scenes on a pretty big project on the topic, I have been wanting to write about Network Function Virtualisation (NFV) and its impact on telecommunications. However, for some reason I have found I have nothing to say. Which (for a pundit) is more than a little worrisome.
It’s not as if there’s nothing to talk about. For the record, NFV has emerged in the telco space as a new way of creating communications services: rather than having custom-built hardware and software for each specific need, why not follow the lead of data centres, bring in a virtualised infrastructure stack and be able to support any new software function with minimal hardware change? What’s not to like?
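The shift described above can be pictured as replacing a hardware appliance with a descriptor that tells a generic virtualised platform what to run. Here is a toy sketch, loosely in the spirit of an ETSI-style VNF descriptor — every field name is invented for illustration, not taken from any real schema:

```yaml
# Illustrative sketch of a virtual network function (VNF) descriptor.
# Field names are hypothetical; real descriptors follow ETSI NFV /
# TOSCA schemas, which are considerably more involved.
vnf:
  name: virtual-firewall
  software_image: vendor/firewall:2.1   # the function is just software
  compute:
    vcpus: 4            # generic compute, not a custom appliance
    memory_gb: 8
    storage_gb: 40
  connection_points:
    - name: north       # towards the core network
    - name: south       # towards subscribers
  scaling:
    min_instances: 1
    max_instances: 10   # scale out on demand, data-centre style
```

The point being that, in principle, a new service becomes a new descriptor and software image rather than a new rack of boxes.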
The answer, as many telcos are discovering, is that it isn’t as simple as that. The challenges of ripping out and replacing an entire global telecoms infrastructure are compounded by the fact that earlier iterations of NFV stacks have not been, shall we say, fully market-ready. Money has been spent, disappointment has been felt, and a mood of reticence has settled across large parts of the industry.
There we have it, in a nutshell. A good idea meets a complex world, working brilliantly for some and causing problems for others. Fall back, regroup and try again, a bit more slowly. Like my anecdotes, we’ve heard it all before. Which goes a long way towards understanding why I’ve struggled to write anything coherent about it.
But it has refused to leave my head, stuck like an unfinished conversation. The issue is not that it’s familiar territory, but how we are lulled, somehow, into believing that anything we do in tech is in any way real. Of course, the industry is rife with mind-bending physics and remarkable engineering, but its ultimate purpose is to create and manipulate representations of things.
It’s all virtual, all of it. Every A-to-D converter, every API, bootloader and on-screen widget is a façade, designed to disguise the few trillion transistors, gates and fibres that lie behind its slick interface. This isn’t some glib philosophical point, as it lies behind a great deal of procrastination and distraction in corporate meeting rooms and indeed, online columns.
Any conversation that starts with the idea “technology idea X doesn’t work” is flawed: we can say it is poorly configured, or that it doesn’t work with existing practices, or that existing practices are a poor fit to what it does. What we should not say, however, is that it is a bad idea in itself: to do so implies that it exists in some tangible way. By treating an idea as a thing, we stop focusing on how well it is designed.
Specifically, on the topic at hand, NFV is a work in progress. It will improve, for two reasons: first, its inherent structure will improve, in functional and quality terms, as it is used and understood; and second, as new functions are developed we will see proper innovation, rather than just building the old services (e.g. messaging, email, voice) in new ways.
More broadly, it was always about virtualisation. While NFV might have been slowed by both corporate inertia and hardware vendor lock-in (loosened, but not removed, by NFV’s sibling trend, software-defined networking), we are moving inexorably towards putting the control in the software rather than tying it into the hardware. While we may all experience Lister-levels of incredulity from time to time, this doesn’t make reality — or indeed virtuality — any less true.
Moving on, here are a couple of articles I have actually been able to write.
5 questions for… ADLINK. Edge-Fog-Cloud?
ADLINK is little known outside its own space, but it is right in the thick of the increasingly distributed world emerging from IoT. Here I speak to Steve Jennis about where this space is going. In conclusion: as we move from a centralisation wave to a distribution wave, the cloud providers are going to have their work cut out.
Is Enterprise DevOps an Oxymoron?
Oh, I long for the days when I was working things out for the first time, those wonderful moments of epiphany… these days I linger somewhere between discovery of the wisdom of others, and the knowledge that (to pick another over-used cultural reference) there is nothing new under the sun. In this article I introduce a report I am currently researching, on how DevOps can make a difference at scale, in the enterprise environment. All thoughts, stories and guidance very welcome.
Extra-curricular: Super-awesome, the Musical (Redux)
The first draft is now complete. No spoilers but the final song is called “Perfectly Illogical.” Now all I have to do is write it through another few times, but that will be less taxing than getting it to here. Famous last words…
Thank you as ever for reading. This bulletin now has over 350 opted-in recipients, and a small number of people (20-ish) I am still to contact. If it’s not your thing, please feel free to unsubscribe at any time.
All the best, Jon
I confess to having hesitated before using the word ‘the’ in the title of this bulletin. After all, it’s quite a statement. Nonetheless, these seven principles have been plaguing me for some time in one way or another. As always, I’d welcome any views you have.
The law of falling thresholds
Advances in the fundamental building blocks of technology — processing, storage and communications — reduce bottlenecks and make new things possible: costs, power draw and physical size fall, while capacity, capability and bandwidth increase. The Internet of Things, for example, is a manifestation of how sensor-based remote management and control can apply to whole new areas — once it becomes affordable, that is. In turn, this enables new practices and models such as pre-emptive maintenance or competitive fitness apps.
At an infrastructure level, falling thresholds are enablers to new approaches to storage and processing, driving specifics such as in-memory Apache Spark, and more general trends like cloud computing. As 5G becomes a thing, so we will see vastly increased bandwidth, making such things as streamed augmented and virtual reality possible — should anybody want them, of course!
The law of self-fulfilling prophecy
Like the technology it creates, the technology industry is constrained by both financial and engineering limitations, which means it has to set priorities. Frequently, these are set based on supply and demand: if demand for RAM increases, finance will be found to create more of it. At the same time, priorities can be influenced by agendas, charisma, personal drive and other forms of influence.
A positive example of this is Moore’s Law: when an entire industry gets behind a single theme, putting the necessary research behind it, it becomes possible to maintain a steady level of progress across, well, decades. We’ve also seen the single-mindedness of people like Steve Jobs, who drove the market for tablet computers through seeming force of will, and of course Elon Musk and Jeff Bezos.
The law of potential differences
Over recent decades, many of the most exciting breakthroughs in technology-led business models have resulted from spotting new connections to enable value exchange. That is, I give you something, and you give me money in return. These are dressed up in clever economic terms but boil down to the same set of questions: what can I give you that you are prepared to pay for, and/or how can I short-circuit existing business models?
The result plays to the agile startup: given just how slowly old corporations move, if I can do something new quickly, I will be able to siphon off a bunch of money and grow to such an extent that I will have established myself by the time they catch up. E-business, disintermediation, Uberisation and the Network Economy are all manifestations of this same principle.
The law of exponential complexity
This can be formulated in a number of ways. For example, that the amount of data that we create will always exceed the amount of processing we have available. Or that the needs of the devices we have to manage, in terms of volumes and rate of update or upgrade, will always exceed our capacity to manage them. Or that the attack surface will always be greater than our ability to secure it. Or that the photos we take will never be tag-able in a meaningful way.
However it is framed, the consequence is always the same: we (the business, IT management, or the home user) live in a constant state of forlorn hope that the next generation of technology will solve what remain some pretty fundamental issues (manageability, security, insight delivery), only to find that next-gen tech creates as many new challenges as it purports to solve. So, on we go.
The law of inflating expectations
A counter to the law of falling thresholds is that a technological advance can very quickly become a default, rather than an exception. We only have to look at the progression of video, from a minority owning expensive, tape-driven camera equipment to a situation where capturing and uploading video has become a blight on music gigs, and indeed a disappointment when it is not possible for whatever reason.
In turn this drives the law of exponential complexity, as our default behaviours result (for example) in generating far more video content than we can fit on our two-year-old smartphones. Again, this is a law which can be ‘leveraged’ by technology companies: by getting the customer base to see the new as the norm, it drives new spend as the old very quickly becomes inferior.
The law of unintended consequences
Innovation is no longer in the hands of the technologically savvy few, as individuals create whole new ways of using tech that were never part of the plan. Often these are positive, such as geocaching; equally, they might drive the use of a new technology to its absolute limit, driving designers to distraction and feeding the law of inflating expectations.
And frequently they can have negative consequences. Each new generation of technology creates new ways to extricate money from people, which is why we have a whole industry around cybersecurity, to counter a whole industry around cybercrime.
The law of innovation decay
Any innovation has a sell-by date: over time, contextual requirements will move to a state which makes the innovation a poor fit to the situation at hand. Individual solutions can never change as quickly as the problem spaces they serve, in many cases because hardware, software and communications are overtaken by the very complexity that they create.
In part this is a consequence of the law of self-fulfilling prophecy: the push to create new things inevitably drives things out of date more quickly. It also results from the laws of both inflating expectations and unintended consequences. Some device vendors have exploited this law through planned obsolescence, accelerating the point at which a device becomes redundant.
All the best, Jon