Samuel Laurinkari, Head of EU Affairs at eBay Inc.
Under the leadership of President-elect Ursula von der Leyen, the incoming EU Commission has committed itself to putting forward a legislative package for digital services, called the Digital Services Act. Based on the stated policy objectives of this initiative - most notably Commissioner-designate Sylvie Goulard's written submissions in the European Parliament hearing process - it will be a mammoth exercise and probably the most important task for the new Commission to get right in the digital policy area.
As part of the legislative package, the Commission wants to address at least A) platform liability, B) platform transparency, C) platform fairness, D) protection of fundamental rights, E) market openness, F) non-discrimination, G) algorithmic decision making, H) data access, I) digital advertising, and J) the working conditions of platform-enabled independent workers.
Drafting regulation for any of these areas is massively complex and requires not only balancing different interests and objectives but also enormous political courage to look beyond the politics of today and push through sustainable forward-looking policy choices.
Probably the most politically sensitive area is point A (platform liability). Essentially, the question here is whether platforms are legally liable for their users’ conduct – whether for hate speech on social media, fraud on online marketplaces, or copyright-infringing content on video sharing platforms. The rules of the current horizontal platform liability framework are laid down in the e-Commerce Directive, the legal cornerstone of today’s platform-based internet. Under the e-Commerce Directive, hosting platforms are not liable for their users’ infringements as long as they either do not have actual knowledge of the infringement or act expeditiously to remove or disable access to the infringing information. Furthermore, the e-Commerce Directive prohibits Member States from imposing a general obligation on hosting platforms to monitor the information they host or to actively seek out infringing activity.
The previous Commission decided not to re-open the horizontal liability framework but rather to pursue targeted, issue-driven initiatives. For example, the Commission’s legislative initiatives on the Copyright Directive, the Terrorist Content Regulation, and the Explosive Precursors Regulation each included provisions either complementing the liability regime of the e-Commerce Directive or derogating from it.
The rationale behind that strategic choice was the recognition that different types of platforms and different types of illegal content require different types of policy responses. What works for terrorist content on social media might not work for unsafe products on online marketplaces. Member States, too, pursued a number of national legislative initiatives following the same logic.
The liability regime of the e-Commerce Directive is often criticized for being outdated (dating back to 2000), having been drafted for a set of very different types of hosting service providers. As a result, its fundamental principles are increasingly being disregarded or derogated from in policy and case law, leading to increased fragmentation and legal uncertainty.
So what should the Commission do?
The Commission’s policy choice with regard to platform liability should be based on two fundamental premises.
Premise 1: The core principles of the horizontal liability framework of the e-Commerce Directive still make sense in the context of today’s platform-based internet. It makes sense that platforms are not legally liable for their users’ wrongdoings as long as they act expeditiously when they become aware of an infringement. It makes sense that platforms do not need to manually monitor user activity for infringements. These are fundamental framework conditions for operating a platform that allows users to offer and share ideas, information, news, pictures, services, goods, and videos at scale – hugely benefitting society and the economy. Importantly, maintaining these principles does not preclude the introduction of reasonable and proportionate proactive obligations for hosting platforms.
Premise 2: “Sectoral”, issue-driven initiatives complementing the horizontal liability framework are here to stay. As noted above, different types of hosting platforms and different types of illegal content require different policy solutions. Specific rules for specific types of illegal content provide legal certainty and allow hosting platforms and relevant stakeholders (authorities or private interest groups) to cooperate on the basis of a granular legal framework. Furthermore, targeted solutions help prevent the unintended consequences of overbroad horizontal initiatives.
Based on these premises, the Commission’s Digital Services Act proposal should put forward a platform liability framework that
- Increases legal certainty for hosting platforms’ proactive measures by introducing a “good Samaritan” principle
- Foresees reasonable and proportionate proactive measures, including appropriate safeguards
- Maintains the core principles of the current liability framework and strengthens the value of those principles by re-legitimizing them through the legislative process