Warning: this gets a bit meta
There’s a game I play pretty much every time I see a technology-related announcement. Case in point, last week I saw something from a vendor which pretty much said, “Now with extra security!” (I am racking my brains to remember who it came from, but no matter, it was a reasonably standard claim). To the game: turn the sentence around and see what thought processes that triggers — in this case, “You mean, before, it had less security? What the heck?”
To be fair, it’s not just me. Many years ago I was particularly taken by a cartoon involving the lasagne-eating Garfield, watching the ads on television. “New and improved! New and improved!” the ads blared out. “To think that, up till now, it was all old and inferior…” the cat opined.
Three dimensions spring to mind. The first is that, yes, it’s just advertising: each release has to be presented as better than before, otherwise people won’t buy it. The second, that the tech industry is changing and progressing, making improvements inevitable and welcome. And the third is that, yes, the previous generation of the product or service really was a bit rubbish.
To shift it up (or down) a level, the whole thing reflects the weight we put on the current narrative. “It’ll be nice when it’s finished,” I sometimes say, without actually joking: behind the humorous facade is a genuine belief that whatever technology has planned for us, this is not ‘it’. We are not there yet, not by a long chalk, and yet, strangely, we try to act as though we are.
Nor do I particularly believe that we are on a journey of some form, though it clearly does feel that way. A journey implies some kind of general direction and route; whereas the progression of technology is more like participating in the multi-dimensional bastard offspring of Cards Against Humanity and Mouse Trap. For sure, we are moving, but with no clear grasp of the rules, or of what is around the next corner.
In large part, we accept the change that is happening to us; or we ignore it; or we wring our hands about it. The one option not available to us is to stop using the technology, for fear of being considered a luddite, or for the simple reason that we quite like it, really. And thus we rely on more superficial narratives that carry us forward, that keep the money flowing, that help others get their heads around it all.
Don’t get me wrong, I’m feeling neither negative nor cynical. I am, however, feeling that we are acting as the passengers of complexity theory, focusing on the knowns because the unknowns are too much of a challenge. Some pretty deep questions persist, not only in terms of ethics or governance (which I bang on about often enough) but also, for example, on the nature of augmentation versus automation, the impact of insight on responsibility, the ability to game our natures, the alignment of interfaces and personality.
I could drill into each of these (indeed, ahem, watch this space) but more important is the fact that they should be part of the more general narrative, but they aren’t, not really.
<* I recognise several weaknesses in this argument, not least that I am of course looking for some kind of philosophical perspective on technological progress. First, of course people are discussing these things: I’m just not party to those conversations. And second, what we might call the framing effect: another issue I’ve seen frequently, where someone stumbles upon a way of looking at things and then, post-epiphany, can’t understand why the rest of the world just won’t do the same BECAUSE IT’S REALLY IMPORTANT. But anyway…*>
Simply put, we could be modelling the kind of outcomes we are trying to achieve from the use of tech. I’m reminded of the Security By Design lobby, which says grosso modo that we should be thinking about security needs at the very outset of creating something new. More broadly, I’m wondering whether, with a few flipcharts and post-its, we can get to a kind of “this-is-a-set-of-principles-we-can-all-adhere-to-by-design” model which goes beyond notions of security, trust, governance yadda yadda and gets closer to an alignment with who we are.
I don’t think the answers would be obvious or immutable, but at least we could have something to build towards, rather than the current acceptance that technology is something to be taken as delivered, whether we like it or not. Often (case in point: social media) the answer will be both at once, but then at least we can make more informed choices.
Thanks for reading, Jon