
France, Germany and Illegal Content Online: Where to, Europe?

Regulating big tech is complex. But establishing harmonized and consistent tech regulation across Europe is even harder, and the recent developments around online hate speech prove it. While the French constitutional court rejected, on June 18, a new law combating hate speech online, Germany is determined to double down on the issue by strengthening its existing legislation. These diverging national developments deepen the conundrum European lawmakers face in the digital realm, making it harder for the European Commission, the EU’s executive arm, to design its new key digital legislation on platform liability: the Digital Services Act (DSA). The DSA aims to revise the EU e-Commerce Directive (2000), the rough equivalent of Section 230 of the Communications Decency Act in the United States, by further harmonizing the regulatory framework for online platforms, including how they tackle illegal content online.

A tale of two regulations: the French Avia Law and the German NetzDG

Germany’s Network Enforcement Act, commonly known as NetzDG, regulates how social networks handle illegal content such as hate speech, slander, defamation, public incitement to criminal offences and more, by making them liable to fines if they fail to remove such content. It was introduced in 2017 after a task force with social media companies failed to produce the results regulators had hoped for. A similar pattern occurred in France, where the government started with an experiment in co-regulation, envisioned by Emmanuel Macron and Mark Zuckerberg. But the initiative ended up as a general report on social network regulation, and a bill targeting content moderation, inspired by NetzDG and the EU’s proposed Online Terrorist Content Regulation (2018), a broad text aimed at limiting the spread of terrorist material on online platforms, was introduced in early 2019 as the Avia law, named after its rapporteur.

Both laws contained the same flagship measure: platforms would be liable to substantial fines if they failed to delete illegal content within 24 hours. In a letter addressed to the French government, the European Commission warned about the risks this measure posed to freedom of speech and fundamental rights, picking up arguments already made by digital rights activists and by the platforms themselves in France and Germany. The French constitutional court used similar reasoning to strike down the Avia law in June 2020, arguing that the measure disproportionately infringed on free speech, notably because it would incentivize platforms to block suspicious but lawful content to avoid fines, a practice described as “over-blocking.” The court also condemned the absence of judicial oversight in the process, which effectively grants online platforms quasi-judicial power to assess the legality of content.

With the upcoming DSA in view, the French ruling makes it a real challenge for European policymakers to design a common approach, as Germany is moving in the opposite direction by tightening its laws. The same day the Avia law was rejected, the German parliament passed the so-called “law to combat right-wing extremism and hate crime,” which also amends the NetzDG by introducing an obligation for social networks to report illegal content flagged by users directly to law enforcement authorities. In addition, the German parliament is currently examining an amendment to the NetzDG that would increase the transparency of social networks and strengthen the rights of users by simplifying the reporting procedure and making it easier to appeal unjustified deletions.

Where is Europe heading?

Through their domestic initiatives and their action at EU level, and given their prominence as two of the EU’s most influential member states, France and Germany were instrumental in pushing the European Commission to work on the proposal for a Digital Services Act. With the recent French ruling and the strengthening of the NetzDG, the Commission will now have to find the right balance as it modernises its platform liability framework. The DSA proposal, to be presented by the end of the year, will be a fundamental overhaul of the EU internet regulatory framework, harmonising rules on the removal of illegal and harmful content across the EU. More broadly, the DSA will assess the responsibilities of online intermediaries with respect to the content that users share, and that is curated, on their services.

France and Germany have long been proponents of “hard” obligations for online platforms, such as strict removal timeframes, but the recent French ruling will almost certainly change the French approach. The government could refocus on what the European Commission and the European Parliament seem to lean towards: a list of measures platforms would need to take to mitigate the risk of illegal content and hate speech being posted, shared and spread online. The Commission seems to favor this approach, which represents a compromise between the absence of binding rules and the hard line of the German law, as the latter may threaten freedom of speech by putting excessive pressure on platforms. As a result, Germany might find itself more isolated in defending its approach at EU level.

This will prove even more difficult as a group of 10 “digital like-minded” countries that includes the Netherlands, Poland and Sweden, known as the Digital 9+ group, has recently published a paper calling for a “light-touch” approach, which would introduce an EU-wide notice-and-action framework that “enables swift and effective removal of the clearly illegal content,” supported by voluntary proactive measures. The group will seek to limit the introduction of additional constraints on platforms. While France will almost certainly no longer back the German hard line because of the recent ruling, both governments will still support ambitious obligations for platforms, such as higher scrutiny (with potential auditing rules), strict removal mechanisms and transparency measures that the Commission floats in its ongoing DSA consultation. Heated discussions on the matter will most certainly take place, not least because the European Parliament, which will be a co-legislator with a role equal to that of Member States in the discussions, is expected to support ambitious rules on the respect of fundamental rights online.

Tags: EU, Hate Speech, Social Media, Tech Policy, Technology
