
The push to ban Nazis on Twitter can just as easily affect the left – as the suspension of a Corbyn supporter proves

Tough rules intended to bring about a ‘safer’ and ‘more friendly’ internet have consequences. They will be misused. And yet if these tough rules are not enforced, the results may be even worse

Paul Bernal
Tuesday 05 March 2019 12:37 GMT

Yesterday’s suspension of the Twitter account @Rachael_Swindon caused significant distress in the left-wing Twitter community. This, according to many, was an attack on free speech, evidence of anti-left-wing bias, a silencing of dissent, and much more. Swindon is a regular critic of the Tories and a keen supporter of Jeremy Corbyn, strongly defending him against accusations of antisemitism.

That may have played a part in her suspension – though the details remain unclear, with Twitter themselves referring only to their “Abusive Behaviour Policy”, which “prohibits behaviour which intimidates, harasses or tries to silence another user’s voice”. The suspension was lifted this morning.

It is easy to see conspiracies here, to feel that your community is being victimised – and that this is all unfair and evidence of bias. It is possible, of course, that it is – but what is also true, and more important, is that this kind of thing is an inevitable consequence of the drive to make the online environment “safer” and to combat abuse and aggression. This drive is strong and growing, covering attempts to remove “harmful” content ranging from extremist material and “fake news” to pro-suicide and pro-anorexia information, as well as efforts to ban what many would call “trolls”.

Last month’s Digital, Culture, Media and Sport Committee report on disinformation and fake news called for a new independent regulator of social media companies such as Facebook and Twitter – and the social media companies have been trying their best to stave off regulation by taking their own actions to deal with all of this.

That’s where the problems come in. These actions include stronger and faster systems to report abuse – because one of the major complaints about social media is that the bad stuff is never removed, or at least never removed fast enough. Many of those now in deep distress over the suspension of @Rachael_Swindon had previously been part of campaigns to remove the Nazis from Twitter – and celebrated when Stephen Yaxley-Lennon (aka Tommy Robinson) was banned from the platform.

The thing is, tools and systems designed to fight against the Nazis can be – and will be – used by Nazis, and against those on the left.

Part of this is justifiable – hateful speech, aggression and abuse can come from all sides, and indeed from the centre – but part of it is a consequence of how systems work. Create a tool, create a set of rules, and people will find ways to use that tool and to take advantage of those rules. As Cardinal Richelieu reputedly said, “if you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him”.

If you search through the tweets of a prolific tweeter – particularly one who tweets on contentious topics – it will not be that hard to find something that can be presented as breaking the rules. If you have enemies online, they will find ways to do this – and that will mean suspensions and bans like the one that has just happened to @Rachael_Swindon.

A great many “trolls” do not see themselves as trolls. They see their enemies as trolls, and themselves as something akin to freedom fighters – and as victims. If they’re on the right, they’re fighting against the oppression of the left, and vice versa. In complex and contentious areas such as questions surrounding transsexuality, this can get even worse. That means that using “anti-troll” tools against opponents is, in the user’s eyes, not playing games or manipulating the rules but applying those rules and tools appropriately. For those seeking the ban of @Rachael_Swindon, she was an antisemite and a bully, so a suspension or ban was appropriate.

Tough rules intended to bring about a “safer” and “more friendly” internet have consequences. They will be misused. They will stifle free speech. They will create anger and resentment – and a kind of fortress mentality in certain communities – when they produce results such as the suspension of @Rachael_Swindon. And yet if these tough rules are not enforced, the results may be even worse. It is not easy, which is perhaps the most important thing to understand in this field. A great deal of care is needed, and there are no easy solutions to find.

Paul Bernal is a senior lecturer in IT, IP and media law at the UEA Law School, and the author of ‘The Internet, Warts and All: Free Speech, Privacy and Truth’, chapter 8 of which, ‘Troubles with Trolls’, looks at this problem in depth
