
Blocking extremist sites is not the same as fighting child porn

2 December 2013


Expert comment by Eerke Boiten, School of Computing

Fresh from its success calling on search engines to block access to child porn, the UK government is turning its attention to terrorism. Ministers are poised to call on internet service providers to block access to websites that contain extremist views.

The government first set out its intentions last month when it said its Extremism Task Force would seek to "counter extremist narrative, including by blocking online sites".

Only a week ago, the introduction of search engine filters to combat child porn raised concerns about censorship. This latest step may well vindicate those fears.

To start with the good news, the proposed technical method for achieving this appears reasonable. A specialist taskforce is to monitor internet content and, with the help of the public, flag up dubious websites. It will then follow up, potentially having the websites taken down and instigating criminal prosecution where appropriate. This seems like a sound strategy against websites representing illegal activity.

The technique is already applied in the context of copyright infringement with some success and has been used by the Internet Watch Foundation for many years to take on child porn sites.

Experts such as Jim Gamble, former head of the Child Exploitation & Online Protection Centre, have argued that this is much more effective and to the point than installing search filters. With extremism, as with child porn, one might still argue that hardened criminals will know ways of circumventing simplistic internet controls, through the use of Tor, VPNs and encryption in general.

Unfortunately, terrorism is a much more problematic area to police than child porn. The numbered scales that categorise the severity of child porn images are well established in criminal courts. The definition and scope of extremism and terrorism are much harder to characterise, and are highly subjective even in mainstream politics. Consider public and political views on Nelson Mandela in Western Europe over the past 40 years, for example.

A critical point here, as so often, is going to be oversight. Who would be in charge of deciding what counts as terrorism and extremism? How would they be accountable? It has been suggested that responsibility might be placed with the police's Counter Terrorism Internet Referral Unit. This seems likely: its web portal already bears a great similarity to the IWF's.

Do the UK public trust accountability in the police sector? Trust in the intelligence services in general, and in the degree to which they are held accountable through democratic processes, must certainly be at an all-time low.

A lack of belief in oversight is not the main worry, though. David Miranda was detained under terrorism legislation in response to the Guardian's Snowden revelations, and terrorism legislation is routinely used to control protest demonstrations. We need to worry about this government abusing "terrorism" to suppress legitimate enquiry and protest, and for that reason alone we should not give it any more control over our biggest avenue of free speech: the internet.

Eerke Boiten is a senior lecturer in the School of Computing at the University of Kent, and Director of the University’s interdisciplinary Centre for Cyber Security Research. He receives funding from EPSRC for the CryptoForma Network of Excellence on Cryptography and Formal Methods.

This article was originally published at The Conversation. Read the original article.

