# Article 6: European Parliament committee rejects upload filters in the fight against terrorist propaganda

According to the EU Commission and the Member States, terrorist propaganda on the internet is a major problem. An EU regulation is intended to sweep such content off the network to protect citizens, but at the same time it jeopardizes freedom of expression and information.

Individual paragraphs or articles of a law seldom manage to become well-known buzzwords - recent exceptions include § 219a and Article 13 of the EU Copyright Directive. The latter is currently prompting demonstrations across Germany because the upload filters it entails would have to check every piece of content for legality before it may appear on an online platform. "Article 13 fires a shotgun at YouTube and Facebook and unfortunately still hits half the internet," netzpolitik.org editor-in-chief Markus Beckedahl recently warned of the law's dangerous side effects.

Less well known, however, is Article 6 of the European Commission's draft regulation currently under negotiation, which is meant to prevent the spread of terrorist content on the internet. It too provides for upload filters - except that these would not only affect certain platforms but could be made mandatory for all service providers operating in Europe. In addition, all online services, whether a large platform like Facebook or a small blog like netzpolitik.org, would have to respond to removal orders within one hour and delete the allegedly terrorist content uploaded by users.

## Collateral damage to freedom of expression and information

That would throw not just half the internet but the entire internet under the wheels and severely restrict the right to freedom of expression and information. For one thing, "terrorism" is a fuzzy term that could also be applied to acts of civil disobedience such as the occupation of the Hambach Forest. For another, automated filtering systems are error-prone because they cannot assess the context of content and will, for example, also strike at scientific or journalistic reporting. And when in doubt, providers would rather delete too much than too little in order to avoid the threatened fines of up to four percent of worldwide annual turnover.

No wonder the proposal is meeting fierce resistance. It comes, among others, from civil society, the European Union Agency for Fundamental Rights, the internet industry and UN Special Rapporteurs - and now also from the European Parliament, which is currently determining its position before entering into negotiations with the Commission and the Council. On Monday, the European Parliament's Committee on the Internal Market and Consumer Protection (IMCO) became the first advisory committee to adopt its report, coming out overwhelmingly against the prior checking of all content.

## Targeted instead of automated measures

Instead of so-called "proactive measures" in the fight against suspected terrorist content on the internet - that is, upload filters based on artificial intelligence that detect such content on their own in advance and prevent it from being uploaded to online platforms - the report proposes targeted, "specific measures". It also clarifies that the rules "should not include automated content filters or other measures that involve systematic monitoring of user behavior" (our own non-legal translation from the English).

In addition to the redrafted Article 6, the report, prepared under the leadership of rapporteur Julia Reda (Pirates / Greens group), also defuses the tight deadline, which would be barely achievable especially for smaller providers. Instead of a hard one-hour limit, the IMCO proposal gives providers at least eight hours to "comply" promptly with deletion requests.

This not only protects smaller providers but also makes the regulation compatible with the E-Commerce Directive, which prohibits Member States from imposing general monitoring obligations on providers and also regulates the "notice and takedown" procedure. That set of rules, which forms the basis for the legally compliant operation of online services, has increasingly come under fire recently and is due to be revised in the coming legislative period. Until that happens, however, more and more EU laws are shifting liability questions onto platform operators by undermining this principle. And that in turn leads to so-called "privatized law enforcement", which eludes any democratic control.

## Time is running out

The IMCO committee's opinion is the first official signal that the European Parliament may not rush the regulation through in a fast-track procedure - even though the committee contributes to the MEPs' position only in an advisory role rather than as the lead committee. Parliament's lead negotiator in the Committee on Civil Liberties, Justice and Home Affairs (LIBE), the conservative Briton Daniel Dalton, is confident that he can wrap up the LIBE report within this legislative period - the vote on the final LIBE report remains scheduled for 21 March.

But that would only be the starting signal for the trilogue, i.e. the negotiations between the Commission, the Council and the Parliament. How these will proceed cannot yet be predicted, but much will depend on the content of the LIBE report - and on whether MEPs bow to the pressure of the Commission, for which neither the adoption of the law nor the removal of potentially illegal content can happen fast enough.

For in addition to Article 6 with its upload filters, Articles 4 and 5, which regulate the procedure for a possible removal of content, are also problematic. Neither article provides for independent judicial oversight; both rely instead on the assessment of the investigating authorities, or even of the private platform operators themselves, as to whether certain content is illegal. Here, too, the IMCO report makes improvements and at least proposes a retrospective judicial review of removal orders. This is intended to encourage the senders of such orders to work carefully, according to the Parliament.

The EU Commission evidently has no such concerns. Commission official Hans Das of the Directorate-General for Migration and Home Affairs, for example, said yesterday at an event on the proposed regulation: "Online, so much terrorist propaganda is recycled so quickly. It would be totally irrational to leave the assessment [of whether it is actually illegal content] to the investigating authorities and the courts."


Created on: 7 March 2019