Social media owners need to stop running uncontrolled playgrounds

2016-01-05

Just how nasty can social media get? Twitter in particular has found itself under the cosh, due to its historical complacency. “We suck at dealing with abuse and trolls on the platform, and we’ve sucked at it for years,” the company’s previous CEO Dick Costolo wrote back in April.

When Jon Ronson wrote a book about the dangers of ‘shaming’ on social media it was to Twitter that he was largely referring. While more extreme cases have resulted in criminal convictions, the majority of tweets fail to cross any significant threshold individually but can build a picture that, as Ronson points out, is more akin to the medieval punishment of the stocks than rational debate.

The fact is, it is simply too easy to throw verbal insults online. For example, whatever people felt about the politics of Andrew Lloyd Webber’s vote on tax credit cuts in the UK House of Lords, many remarks dwelt on his appearance rather than his decision.

And now, with a falling share price, Twitter is stumbling into action, re-engaging founder Jack Dorsey as CEO and offering new capabilities that might prevent user numbers from flat-lining. Among the features is Moments: curated streams of tweets around current events. Hold on to that word - ‘curated’.

Twitter isn’t alone in finding that the dream of user-generated content is more Lord of the Flies than Paradise Island. Equally notorious for its crude and sometimes cripplingly harsh comment streams has been YouTube. And indeed, across the Web, platforms have repeatedly become sources of highly offensive, even abusive content.

Facebook, for example, remains a significant source of cyberbullying as teenagers use the service to display behaviours previously limited to the offline world. And Snapchat has been linked to sharing inappropriate images, with consent or otherwise.

It is interesting to compare the models of different sites. Whereas on Twitter most messages are shared publicly, on Facebook they tend to be shared with (so-called) friends. To state the obvious, this makes the former more of a platform for public shaming, and the latter for bullying within close communities.

It seems that every time somebody comes up with a way of sharing information, it invariably becomes misused. So is there any hope? Interestingly, we can draw some inspiration and hope from a site that appears to tend towards the chaotic.

Reddit, that lovechild of Usenet forums and social media, enables the creation of individual spaces (‘sub-reddits’), each of which is curated by its creator to a greater or lesser extent. While parts of Reddit can get a bit hairy (in the same way as Twitter), at some level, humans remain in control.

The notion of curation — that is, keeping responsible people and the community involved — does seem to hold the key. For curation to be possible it requires the right tools.

The importance of the down-vote to Reddit cannot be over-stated, as it creates a generally accessible balancing factor. At the other end of the scale, Quora (which also has both an up-vote and a down-vote) delivers safe, wholesome, curated Q&A.

It is this additional level of responsibility that should set the scene for the future. With a caveat: nobody wants the Web to get “all schoolmarmish on your ass”; indeed, even if it could, it would doubtless cause people to run to the nearest virtual exit.

At the same time, we can see a future where we move from the uncontrolled playgrounds of social media (with the occasional, knee-jerk reaction from their respective authorities) to a place in which we take more personal responsibility for our actions.

The alternative is irresponsibility, either on the part of the individuals creating messages or the companies allowing it to happen. It’s like the old English adage: “The problem isn’t stealing, the problem is getting caught.”

Simply put, our online culture needs checks and balances at all levels, not to restrict general behaviour but to prevent the excesses we exhibit if no restrictions are in place. It is no different in the virtual than the real.

While platform providers may not see it as their place to act as judge and jury — itself a point of debate — they should nonetheless provide the tools necessary to ensure people can congregate without fear for their online safety.

Not to do so is irresponsible, but more than this: as we develop and mature online, we will inevitably gravitate towards platforms that, by their nature, offer some basic protections against abuse. Many Twitter users are already moving on, and it is surely no coincidence that Facebook is losing favour with younger users.

Even as social media companies look to provide more ‘exciting’ ways to interact, they ignore basic, very human needs — such as existing without fear — at their peril.