Bulletin 21 December 2018. The loneliness of the cybersecurity professional
Where the bad things are
I do feel sorry for anyone who works in cybersecurity. They’re like the people who thought bicycle helmets were a good idea before anyone else thought they were cool, or like anyone in health and safety, or quality, or any other discipline that (from the outside) looks like it’s trying to get in the way.
Worst case, security professionals are lumped into that humourless group that wants to prevent people, businesses, whoever from getting on with their lives. Like, you know, traffic wardens. Unlike traffic wardens, though, it’s the security person who takes the hit when things go wrong. As I say, it’s a thankless task.
It’s the security professional’s job to predict what might go wrong, and therefore create an opportunity to do something about it in advance. What a task, when everything you say is a hair’s breadth from doom-mongering. It’s one of the reasons I advocate visibility as a precept: that is, tell the board what the risk is, and then let them decide whether to play safe, or fast and loose.
Here’s the idea: in business leadership, one of the main jobs is balancing risk. In general this is measured financially, although business leaders are no strangers to saying “what the heck” and just doing something anyway, based on gut feel (indeed, this increasingly feels like how the West is run, but I digress).
So, forewarned is forearmed. Trouble is, in many cases we just don’t know what the potential consequences are. When writing about data privacy in recent years, I have talked about aggregation and peripheral risk: that is, what if a data sample makes it look like you were in the vicinity of a crime?
Such risks still exist. Unfortunately, however, I failed to consider the possibility of harvested behavioural data being used to target incite-ful ads that have had an unexpectedly strong influence on our democratic processes; nor did I predict the use of such techniques by foreign powers. I wasn’t alone.
Our inability to spot such world-changing consequences of how we use technology further undermines the credibility of the crystal-ball-gazing enforcers we call cybersecurity experts. This does lead to models such as cyber-recovery planning (“plan for failure, as it’s all going to go wrong anyway”), but it can also feed a “why bother, as it’s all going to go wrong anyway” attitude.
Security pros are thus between the rock of a seemingly intransigent, uncaring business, and the hard place of failing to spot some future cataclysm. Yes, I do feel sorry for them.
On a more cheery note, here’s an article for this week.
AWS Re:Invent 2018 Reflects an Industry Coming of Age
Technology is at an interesting juncture, as we move from a wave of fragmentation and explosion to one of standardisation and simplification. The former was caused by the joint forces of cloud infrastructure and open source software, and the latter is happening because just using either is no longer a differentiator in itself. That’s my take anyway, with consequences for even the biggest cloud players such as AWS.
That’s all for now. It just remains for me to wish all of my readers a very happy Christmas (don’t eat too much plum pudding), and see you next week. Thanks for reading!
Jon