Content Moderation and Intermediary Liability (IML) in the European Union
Legal and Policy Frameworks Impacting Content Moderation in Europe
As the US looks to reform Section 230, it is essential to track policies abroad. European countries and the EU face similar policy questions but within a different legal and governance framework. In this second part of our ongoing series on intermediary liability (IML), we review the policy and legal frameworks that shape content moderation in Europe.
E-Commerce Directive
The European Union’s approach to intermediary liability (IML) was first established in Directive 2000/31/EC, also known as the E-Commerce Directive (ECD). Like Section 230, the ECD provides a liability shield for “passive” online services offering the following:
- Mere conduit – the transmission or temporary storage of third-party content. A mere conduit cannot initiate the transmission of information, select the receiver of the transmission, or select or modify the information contained in the transmission.
- Caching – the automatic, intermediate, and temporary storage of information for the sole purpose of more efficiently facilitating the information’s transmission to other recipients of the service upon their request.
- Hosting – the storage of information provided by the recipient of a service.
Importantly, the IML protections provided by the ECD do not extend to online services that play a more active role in content organization, as many social media platforms do. The law also prohibits EU member states from imposing general information monitoring obligations on information society service providers, which the directive defines as any “service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.”
While the ECD does not draw an explicit distinction between “active” and “passive” hosts, Articles 13 and 14 each contain an “actual knowledge” provision specific to caching and hosting services; online services acting in these capacities must remove illegal content expeditiously once they become aware of it. The ECD’s more general requirement that information society service providers’ activities be of a “mere technical, automatic, and passive nature” was elaborated on by the Court of Justice of the European Union (CJEU) in the Google France and L’Oréal v. eBay cases. Drawing on Recital 42 of the ECD, the court distinguished active from passive hosts, clarifying that intermediaries become active hosts and lose liability protections once they take steps to frame how information is presented to users. The CJEU has ruled separately that service providers may open themselves up to liability if they become aware of facts or circumstances that should have given them knowledge of illegal activity on their platforms.
Digital Services Act
In the face of new challenges brought on by the rise of large online platforms, the European Commission (EC), under the leadership of Ursula von der Leyen, has put forward a proposed Digital Services Act (DSA) to update the ECD with new provisions on illegal content, transparent advertising, disinformation, and intermediary liability. Fresh off its success implementing the General Data Protection Regulation (GDPR) in 2018, the EU hopes not only to build on its technology policy frameworks but also to establish global standards and export its model to other countries, as happened with the GDPR. The DSA preserves the ECD’s provisions on mere conduit, caching, and hosting while adding a suite of new obligations for online service providers, including:
- Compliance with Member States’ orders to act against illegal content and to provide information the service has collected on its users.
- Due diligence obligations, including establishing points of contact for Member States, designating a legal representative within the EU, publishing annual content moderation reports, and creating internal complaint-handling systems.
- Suspension of users who frequently post illegal content.
- Transparency requirements for advertisements.
Notably, the DSA also introduces a tiered responsibility framework that imposes varying obligations on different types and sizes of services: intermediary services, hosting services, online platform services, and very large online platform (VLOP) services. VLOPs are defined as services with at least 45 million average monthly users in the EU. Failure to comply with the DSA’s obligations would carry penalties of up to 6% of an online service’s annual income or turnover, and up to 1% for submitting “incorrect, incomplete, or misleading” information.
Public consultation on the initiative ran from June to September 2020, and the EC tabled the proposed DSA in December of that year to update the current EU legal framework governing digital services. The body responsible for the DSA in the European Parliament, the Internal Market and Consumer Protection (IMCO) Committee, appointed MEP Christel Schaldemose as rapporteur for the proposal in January 2021, and she presented a draft report on the DSA in late May. After several rounds of debates, hearings, and draft opinions from across civil society, the IMCO Committee voted in favor of the DSA. The full Parliament is expected to vote on the amended DSA in January 2022. In parallel, the Council of the European Union finalized its position on the DSA in November 2021, proposing several amendments, including:
- Explicitly including online search engines as covered entities
- Enhancing protections for minors online
- Extending the obligation to report suspicions of serious illegal activity from online platforms to all hosting services
- More detailed provisions on the “compliance function” that VLOPs or very large online search engines (VLOSEs) must establish
- Establishing a governance framework that defines the authorities of national regulators, the European Commission, and a new European Board for Digital Services
In early 2022, the European Parliament, Council of the European Union, and European Commission will enter interinstitutional negotiations towards producing a final draft of the DSA. The Commission is also finalizing an updated Code of Practice on Disinformation, which is a voluntary agreement between the EU, tech companies, and other actors, on what measures they will put into place to combat mis- and disinformation. This effort is seen as a precursor to what might ultimately be legally required once the DSA is passed.
Lastly, the United States has become an actor in these negotiations, as the Biden administration and Congress have expressed concern to EU officials that the regulations disproportionately target American tech companies. US officials have also said they worry the measures could create security risks, for example by requiring “sideloading,” the distribution of apps outside of closed systems such as Apple’s App Store.
National Initiatives in Europe
At the Member State level, several European countries have passed or are considering their own IML regimes. Perhaps the best known of these is Germany’s Netzwerkdurchsetzungsgesetz, or NetzDG, passed in 2017. In stark contrast to the liability protections afforded by Section 230, NetzDG is an example of IML legislation that holds online platforms responsible for content published on their platforms – specifically, hate speech and disinformation. The law requires providers of online social networks with at least 2 million members to establish a transparent procedure for handling complaints about illegal content, check those complaints immediately, delete “obviously illegal” content within 24 hours, and remove any other illegal content within 7 days.

France passed a similar law, titled Fighting Hate on the Internet, in 2020, requiring social media sites operating in France to remove offensive content within 24 hours of notification or face an initial fine of up to €1.25 million; repeated offenses could lead to fines of up to 4% of global revenue. The law went even further than NetzDG by requiring illegal content related to child pornography and terrorism to be removed within 1 hour of being flagged, although France’s Constitutional Council struck down many of these key provisions one month after the law’s passage.

In Ireland, the government approved additional provisions and finalized its own Online Safety and Media Regulation Bill in December 2020. The bill provides for the appointment of an Online Safety Commissioner to oversee a new regulatory framework for online safety.
In the United Kingdom, the current government under Boris Johnson is looking to strengthen its IML regime through a proposed Online Safety Bill. The UK does not currently have legislation specifically addressing intermediary liability; the closest thing to it is the 2013 Defamation Act, under which online service providers are only held liable for defamatory third-party content published on their platforms if the claimant can demonstrate that “it was not possible for the claimant to identify the person who posted the statement,” “the claimant gave the operator a notice of complaint,” and “the operator failed to respond to the notice of complaint”. The UK government published its draft Online Safety Bill in May 2021, although UK Secretary of State for Digital, Culture, Media, and Sport Nadine Dorries has stated that her ministry is currently revising the text with tougher sanctions, to be introduced by March 2022. Its provisions include:
- Giving the relevant Secretary of State the power to designate various types of harmful content, such as trolling, online fraud, and illegal pornography.
- Creating a “Duty of Care” for online platforms to act against both illegal and legal-but-harmful content, subject to fines of up to £18 million or 10% of their annual turnover, whichever is higher.
- Establishing a tiered system to impose additional requirements for the very biggest platforms, referred to as “Category One”.
- Empowering the UK’s Office of Communications (Ofcom) to block access to particular websites.
- Obliging social media platforms not to remove, and to preserve access to, content of journalistic or “democratic importance”, such as user comments on online political content.