A Threat to Freedom: the EU and Online Terrorist Content

Hannah Bettsworth // 12 March 2019

In the field of digital rights and civil liberties, all anyone’s talking about is the Digital Copyright Directive and saving memes. However, a bigger threat may be passing under the radar of the average internet user. The proposed Regulation on Preventing the Dissemination of Terrorist Content Online was published in September 2018, with the following main provisions:

  • Defining illegal terrorist content as “information which is used to incite and glorify the commission of terrorist offences, encouraging the contribution to and providing instructions for committing terrorist offences as well as promoting participation in terrorist groups.”
  • Introducing a one-hour time limit for removal of or disabling access to content when an administrative or judicial removal order is issued by a relevant national authority.
  • Requiring companies which host uploaded content to proactively find and remove terrorist material from their websites, including through automated filters.
  • Making companies preserve removed content as a safeguard against wrongful removals, but also for law enforcement purposes.

Firstly, there are technical concerns. The Regulation, as currently drafted, covers cloud service companies that provide the infrastructure underlying data hosting; they would find it technically impossible to comply. The Rapporteur, Daniel Dalton MEP, stated that the Commission told a committee hearing it had no intention of including such services in the Regulation and was attempting to cover Dropbox-style services.

That in itself raises questions about civil servants’ capacity to regulate the internet, but it is not the only technical issue facing the Regulation. The Hash Database is a voluntary initiative between Facebook, YouTube, Microsoft and Twitter to tag extreme terrorist content with unique markers (hashes) so that it cannot be re-uploaded. Content is classified under the companies’ terms of service rather than under legal definitions.
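To make that mechanism concrete, here is a minimal sketch of how hash-based re-upload blocking works in principle: a fingerprint of removed content is stored in a shared database, and every new upload is checked against it. The function names are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes real systems use; the point it illustrates is that the filter matches fingerprints, not the uploader’s context or intent.

```python
import hashlib

# Hypothetical shared database of fingerprints of content already flagged
# as terrorist material under the platforms' terms of service.
# Real systems use perceptual hashes that tolerate re-encoding and small
# edits; a plain SHA-256 digest is used here only to keep the sketch simple.
flagged_hashes = set()

def register_flagged(content: bytes) -> None:
    """Store the fingerprint of removed content in the shared database."""
    flagged_hashes.add(hashlib.sha256(content).hexdigest())

def is_reupload(content: bytes) -> bool:
    """Check an incoming upload against the shared database of fingerprints."""
    return hashlib.sha256(content).hexdigest() in flagged_hashes

# Once a video is flagged, any byte-identical re-upload is blocked --
# regardless of whether the uploader is a propagandist, a journalist,
# or a human rights archive documenting the same footage.
register_flagged(b"<bytes of a removed propaganda video>")
print(is_reupload(b"<bytes of a removed propaganda video>"))  # True
```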

Technical filters cannot understand context: the Syrian Archive, which documents evidence of human rights violations in Syria, is constantly losing its YouTube uploads because machine learning algorithms remove them faster than its team can back them up. Under the Regulation, providers would have to retain removed content for six months to assist law enforcement and to reinstate it after a successful appeal. By then, human rights defenders would already have missed the opportunity to raise awareness, and the atrocities would be long forgotten.

As such, the Regulation’s provisions for upload filters would pose a real risk to freedom of information and expression on the internet. The European Parliament’s Culture and Education Committee recognised this in its opinion on the Regulation, calling for an exception for educational, journalistic and research material and for the removal of financial penalties that would incentivise overzealous blocking. There is also no exception or flexibility for micro-enterprises, which would find it harder to process such requests; the Parliament has sought to rectify this in its amendments.

The UN Special Rapporteurs for freedom of opinion, privacy rights, and human rights in counter-terrorism have also expressed serious concerns about the proposals. They note that the definition of online terrorist content is so broad that it omits the question of intent: your content could be removed for advocating or glorifying terrorism even if your intent was nothing of the sort. This concern may seem overblown, but it is a reality in some EU countries.

The UK has recently passed a law making it a crime to view or access material useful for terrorism, even without any terrorist intent, unless you are a journalist, an academic, or accessed it accidentally. A French far-leftist who explicitly opposed ISIS was sentenced to 18 months in jail for praising the Paris attackers’ courage in fighting the police. A vegan activist received a seven-month suspended sentence for saying there was ‘justice after all’ when a butcher was killed in a terrorist attack. In Spain, the rappers Valtonyc and Hasel have been convicted of ‘glorifying terrorism’ over lyrics about killing royal and political figures, and a student was given a suspended one-year sentence for joking about ETA’s assassination of a Francoist prime minister. That conviction was later overturned and led to a spate of copycat jokes and memes.

There is also the problem of the Streisand Effect, whereby any attempt to suppress views will draw more attention to them. However, this policy is not merely politically counter-productive. It also rides roughshod over fundamental rights. It’s abhorrent to want to behead royals or to revel in the death of innocent shop workers, but freedom of expression is not only for the prevailing opinion.

This Regulation, as it stands, would institutionalise this worrying trend at the European Union level and legitimise the equally worrying trend of outsourcing decisions to the private sector as a short-cut for circumventing the judicial system and achieving speedy content removal. The seriousness of terror attacks should not be used to allow the EU and its Member States carte blanche to restrict the rights and freedoms that counter-terror policy seeks to protect from extremists.

In its current form, the Regulation should not be enacted. The Commission must work with the Parliament, digital rights groups, and affected companies to take their concerns on board and put fundamental rights at the heart of any action taken. If it does not, it risks losing legitimacy in the eyes of young, open-minded people who see a free Internet as a key part of a free society.

The opinions in this article belong to the author only and are not necessarily representative of EPICENTER or its member think tanks.

EPICENTER publications and contributions from our member think tanks are designed to promote the discussion of economic issues and the role of markets in solving economic and social problems. As with all EPICENTER publications, the views expressed here are those of the author and not EPICENTER or its member think tanks (which have no corporate view).
