Platforms: The Future of Notice & Takedown in Europe

Data is not the new oil. But drilling platforms like these can still serve as a metaphor for online platforms. CC-BY 4.0 Divulgação Petrobras / ABr

Kirsten Fiedler is Senior Policy and Campaign Manager at European Digital Rights (EDRi), a civil society digital rights organization based in Brussels. Kirsten is a longtime author at netzpolitik.org.

In Brussels' corridors, the market power and dominance of data companies like Google and Facebook is causing ever greater displeasure. The question of how such platforms should be regulated, and in particular how they should deal with allegedly illegal content, fills entire conference rooms. There are no concrete proposals yet, but it has been rumored for several months that Europe's e-commerce rules are to be changed. Platform regulation is more than just a buzzword: the issue is taking shape and could indeed become the next major lobbying battle.

What it's all about

The e-Commerce Directive (2000/31/EC) of the European Union regulates central questions of electronic commerce, above all the liability and responsibility of providers.

The Directive has provided a safe harbour for service providers for more than 15 years. It establishes that, under certain conditions, providers are not liable for the content their users upload and share. This rule has helped the Internet develop in a revolutionary way: providers need not fear being dragged to court for every unlawful act of their users.

The Directive states (recital 40):

"Service providers have a duty to act, under certain circumstances, with a view to preventing or stopping illegal activities; this Directive should constitute the appropriate basis for the development of rapid and reliable procedures for removing and disabling access to illegal information."

What is "Notice & Takedown"?

The e-Commerce Directive indirectly established a so-called "Notice & Takedown" procedure in Europe. Under Article 14 of the Directive, providers benefit from a liability exemption if they remove or disable access to information as soon as they become aware of its unlawful nature. The rules apply to any kind of illegal or unlawful content.

In addition, Article 15 of the Directive states that service providers may not be turned into an Internet police force: in any case, they must not be obliged to carry out general, active monitoring of all content.

Despite its widespread use, "takedown" is just one of many actions a provider can take. A more appropriate term would actually be "Notice and Action", since it encompasses the various procedures for removing illegal or infringing content from a platform on the basis of a received notice. Providers can respond to notices in different ways: they can either act immediately and remove or block the content, or wait for a response from the user and act accordingly after receiving a defense by counter-notification.
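The two response paths just described can be sketched as a tiny state machine. This is purely illustrative, not based on any real platform's implementation; the state names and the `handle_notice` function are invented for this sketch.

```python
from enum import Enum, auto

class NoticeState(Enum):
    """Possible outcomes after a notice arrives (illustrative only)."""
    CONTENT_BLOCKED = auto()  # provider acts immediately and disables access
    RESTORED = auto()         # counter-notification accepted, content stays up
    REMOVED = auto()          # no successful counter-notification, content removed

def handle_notice(act_immediately: bool, counter_notification: bool) -> NoticeState:
    """Walk through the two response paths described in the text."""
    if act_immediately:
        # Path 1: remove or disable access right away ("takedown")
        return NoticeState.CONTENT_BLOCKED
    # Path 2: forward the notice and wait for the user's defense
    if counter_notification:
        return NoticeState.RESTORED
    return NoticeState.REMOVED
```

The sketch makes the policy point concrete: the liability incentive discussed below pushes providers toward the first branch, since acting immediately minimizes their own legal risk.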

Fundamental rights and collateral damage

Notice and action mechanisms have a direct impact on freedom of expression, as they govern how content is removed or blocked on the network. Since the e-Commerce Directive, providers have been expected to weigh competing rights and interests. This is of course problematic, in that private companies are not qualified to replace courts in such an important task. This is often referred to as "privatized law enforcement".

If platforms refuse to delete content, they run the risk of being held liable for it. Profit-oriented companies want to minimize costs and avoid litigation and legal proceedings wherever possible. A platform is therefore likely to adopt an approach that deletes too much rather than too little. It is not surprising that in many cases the examination of the content's possible illegality, and the weighing of the rights involved, are minimal. This is likely to lead to the preventive removal of perfectly legitimate content, also known as "overblocking".

Moreover, the internal processes of the large platforms are opaque. This makes it impossible to objectively analyze how effective or precise their measures are.

Where the journey is going

Since the e-Commerce Directive, a few platforms have gained enormous market power on the Internet; they play an increasingly active role and command vast resources, both technological and financial. The question therefore increasingly arises whether the rules on liability exemption are still up to date.

In addition, there is a controversial trend at the EU level that is eroding the e-commerce rules: platforms will likely soon be required to take proactive measures to identify and filter content, or to block access to it, mostly by means of automated technologies such as upload filters.

In recent years, the EU Commission has proposed a number of regulations that lead to the use of automated systems. One example is the copyright reform, but also the revision of the Audiovisual Media Services Directive and the proposal on preventing the dissemination of terrorist content online. With regard to the protection of freedom of expression, it would be advisable not to abolish the horizontally applicable safe harbour for providers; yet it is already being undermined vertically by the aforementioned legislative initiatives.

The question now is how the e-Commerce Directive will be revised. It is clear that a revision also offers opportunities: hitherto unclear terms could be clarified, rules could finally be harmonized, and greater legal certainty created. It would at last be possible to introduce a much clearer, graduated system for Europe, one that respects freedom of expression and takes up proposals from civil society and academia, such as the Manila Principles or the Santa Clara Principles.
