
    EU Approach to Intermediary Liability

    Module 3: Content Restrictions and Intermediary Liability

    The DSA is the EU’s effort to combat unlawful speech on the Internet. Political agreement on the DSA was reached in April 2022 between the European Parliament and EU Member States. It entered into force in November 2022, but application of its provisions only began in February 2024.(1) The DSA contains a common set of rules on the responsibilities and accountability of providers of intermediary services and online platforms. It also aims to harmonise the legal frameworks of Member States and to protect all Internet service users by setting out notice-and-action procedures for illegal content and the possibility of challenging platform content moderation decisions.(2)

    The DSA is applicable to ‘intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.’ The scope of the legislation covers intermediary services consisting of ‘mere conduit’, ‘caching’ and ‘hosting’ services.(3) This means that the DSA does not apply to individuals who, for example, run a blog or discussion forum, allow discussions on their Facebook account or another platform, or otherwise create or publish user-generated content.(4) The regulation is nonetheless important for such platform administrators: where they fail to remove content alleged to be unlawful, a request for removal of that content can be made to the service provider of the platform they use.

    The ECD Approach

    The transnational nature of the Internet can cause problems, as content may be published in one state from servers located in another. This was an issue in the well-known case of Glawischnig-Piesczek v Facebook Ireland Limited.(5) The claimant was a prominent Austrian politician. The defendant, Facebook Ireland Ltd., was described as the operator of a global social media platform for users located outside the USA and Canada.(6)

    In April 2016, an anonymous Facebook user shared an article from the Austrian online news magazine oe24.at titled ‘Greens: Minimum income for refugees should stay’ and published a comment calling Glawischnig-Piesczek “miese Volksverräterin” (lousy traitor), “korrupten Trampel” (corrupt bumpkin) and her party a “Faschistenpartei” (fascist party). This generated a thumbnail on Facebook containing the title of the article, and a photograph of Glawischnig-Piesczek. Both the post and comment could be accessed by any Facebook user. On 7 July 2016, Glawischnig-Piesczek asked Facebook to delete the posts and to reveal the user’s identity. After Facebook neither deleted the posts nor revealed the user’s identity, Glawischnig-Piesczek applied for an injunction. She argued that her right to control the use of her own image under the Austrian Law on the protection of copyright had been violated. She further claimed that the defamatory comment, which was posted together with the picture, constituted an infringement of the Austrian Civil Code, which protects people from hate speech.

    Facebook Ireland Ltd. argued, first, that it was governed by Californian law (the site of its headquarters) or Irish law (its European base), but not Austrian law. Secondly, it relied on its host-provider privileges under the ECD, which exclude host providers from liability for their users’ content. Facebook also argued that the impugned comments were protected by the right to freedom of expression under Article 10 ECHR.

    The Austrian court ordered Facebook to ‘cease and desist from publishing’ the photograph if the accompanying text ‘contained the assertions, verbatim and/or using words having an equivalent meaning’ to the defamatory comment. Facebook Ireland disabled access to the said content in Austria. On appeal, the court upheld the order ‘as regards the identical allegations’ but held that the ‘dissemination of allegations of equivalent content had to cease only as regards those brought to the knowledge of Facebook Ireland by the applicant or by third parties’.(7)

    The courts agreed that the defamatory comments implied that Glawischnig-Piesczek was engaged in illegal activities without providing any evidence and were therefore harmful to her reputation. Both parties appealed to the Supreme Court, which referred the following questions to the CJEU:

    1. Whether, under Article 15 of the Directive, an injunction against a hosting provider could extend to statements that are identically worded and/or have equivalent content; and
    2. If such an injunction could apply worldwide.

    The CJEU found that the ECD does not preclude a Member State from ordering a hosting provider to remove or block content that has been declared unlawful, or content that is identical or equivalent to such unlawful information. The Court also held that the Directive does not preclude Member States from ordering such removal worldwide, and therefore left it to the Member States to determine the geographic scope of the restriction within the framework of the relevant national and international laws. The Court found that monitoring for content identical to that which was declared illegal would fall within the allowance for monitoring in a “specific case” and thus would not violate the Directive’s prohibition on general monitoring. This allowance could also extend to equivalent content, provided the host was not required to “carry out an independent assessment of that content” and employed automated search tools for the “elements specified in the injunction.”

    The judgment has major implications for online freedom of expression around the world. The judgment means that Facebook would have to use automated filters to identify social media posts that are ‘identical content’ or ‘equivalent content’. Technology is used to identify and delete content that is considered illegal in most countries, for example, child abuse images. However, this ruling could see filters being used to search text posts for defamatory content, which is more problematic given that the meaning of text could change depending on the context. Compelling social media platforms like Facebook to automatically remove posts regardless of their context infringes free speech rights and restricts access to online information. One of the main concerns with the judgment was that it did not appreciate the limitations of technology when it comes to automated filters.
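    To illustrate this concern about automated filtering, the sketch below (purely hypothetical names and data, not Facebook’s actual systems or anything mandated by the judgment) contrasts exact matching of ‘identical content’ with a crude keyword heuristic for ‘equivalent content’: the former only catches verbatim re-posts, while the latter cannot assess context and so risks flagging lawful reporting, quotation or satire.

```python
# Illustrative sketch only: hypothetical data and function names, not a real
# moderation system. It contrasts matching *identical* content with a crude
# keyword heuristic for *equivalent* content, which cannot assess context.

UNLAWFUL_POSTS = {
    "example of a comment already declared unlawful by a court",
}

def is_identical(post: str) -> bool:
    """Flag only verbatim re-posts of content already declared unlawful."""
    return post.strip().lower() in UNLAWFUL_POSTS

def looks_equivalent(post: str, keywords: set[str]) -> bool:
    """Crude keyword heuristic for 'equivalent' content.

    The same words may be defamatory in one context and lawful reporting,
    quotation, or satire in another; keyword matching cannot tell them apart.
    """
    text = post.lower()
    return any(word in text for word in keywords)

if __name__ == "__main__":
    post = "A news article quoting the unlawful comment in order to criticise it"
    print(is_identical(post))                    # False: not a verbatim re-post
    print(looks_equivalent(post, {"unlawful"}))  # True: flagged despite its lawful context
```

    The second function over-removes lawful speech, which is precisely the free-expression worry raised about extending injunctions to ‘equivalent’ content.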

    A further concern was that the judgment meant that a court in one EU member state could order the removal of social media posts in other states, even if they are not considered unlawful there. This would set a dangerous precedent where the courts of one country can control what Internet users in another country can see. This would allow for abuse, particularly by regimes with weak human rights records.

    The DSA Regime

    The case of Glawischnig-Piesczek v Facebook Ireland Limited was decided under the ECD. The DSA carries over the hosting, caching, and mere conduit defences that first appeared in the ECD.

    This includes prohibiting general monitoring obligations from being imposed on intermediary service providers and preserving the existing ‘notice and takedown’ process, under which a hosting provider only becomes liable for illegal content if it has actual knowledge of the unlawfulness and fails to remove or disable access to the content expeditiously.(8)

    Under the DSA a clearer line is drawn between the liability of online platforms and their liability under consumer law. Online platforms, such as marketplaces, will remain liable under consumer law when they lead an ‘average consumer’ to believe that the information, or the product or service that is the object of the transaction, is provided either by themselves or by a recipient of the service who is acting under their authority or control.(9) This will be the case, for example, where an online platform withholds the identity or contact details of a seller until after the conclusion of the contract between that seller and the consumer, or where an online platform markets the product or service in its own name rather than in the name of the seller who will supply that product or service.(10)

    The meaning of ‘average consumer’ was considered by Advocate General Szpunar in the Louboutin case.(11) The Advocate General’s opinion suggests that the marketplace will be liable where a ‘reasonably well-informed and reasonably observant internet user’ perceives the offer of the seller as an integral part of the commercial offer of the marketplace.(12)

    The fact that an intermediary service provider automatically indexes information uploaded to its service, has a search function, or recommends information based on the preferences of its users is not a sufficient ground for considering that provider to have specific knowledge of illegal activities carried out on the platform or of illegal content stored on it.(13)

    Maintaining the hosting defence and other intermediary protections is positive but online platforms will now be subject to significant new obligations under the DSA.

    Providers of Intermediary Services

    All intermediary service providers (including those only providing mere conduit and caching services) must comply with the following requirements:

    • Reflecting the fact that some service providers can be difficult to identify and contact, they must provide a public ‘point of contact’ through which they can be reached by authorities and users.
    • If a service provider is based outside the EU (but offers services in the EU) it must appoint a legal representative in the EU. This sounds similar to the EU representative concept in the General Data Protection Regulation (GDPR).(14) However, there is no exemption for small companies. In addition, under the DSA, that representative can be held directly liable for breaches. Given the potentially punitive sanctions (section 6 below), this is not a role to be undertaken lightly. It is not clear whether there will be a ready (or cheap) pool of people willing to take on this role, which is highly problematic given the very large number of intermediary service providers subject to this obligation.
    • The service provider must set out in its terms and conditions any restrictions on the service, alongside details such as content moderation measures and algorithmic decision making.
    • The service provider must issue an annual transparency report on matters such as content moderation measures and the number of takedown and disclosure orders received.
    • Service providers that receive take down or information disclosure orders from judicial or administrative authorities in the EU must notify the authority of any action taken.

    Providers of Hosting Services

    Hosting services are a subset of intermediary services consisting of the storage of information provided by or at the request of a user, such as cloud service providers, online marketplaces, social media, and mobile application stores.

    In addition to the above, hosting providers are subject to additional obligations:

    1. Anyone should be able to notify the hosting provider of illegal content (not just judicial or administrative authorities). The hosting provider must process that notice diligently and report back on whether the content was removed.
    2. Hosting providers must notify users if they remove content. This also applies where they demote or restrict the visibility of content, and the notification should include details of whether the decision was taken using automated means (e.g. based on machine learning classifications).
    3. Hosting providers must inform the judicial authorities if the hosted content creates a suspicion that a criminal offence has occurred, limited to offences involving a threat to life or safety.

    New provisions in the DSA apply to online platforms such as social media services and online marketplaces. Any attempt to regulate user-provided content is fraught with difficulties and raises hard questions about the balance between fundamental rights to freedom of information, the impact of online harms, and the practical limitations of attempting to moderate content at scale.

    The DSA generally takes a back-seat role. Except for large platforms, there are limited obligations to oversee content on the platform. Instead, the new regime appears to lean towards protecting content by giving users a right to complain against the removal of content, and even to use an out-of-court appeals process if they are unhappy with the platform’s handling of that complaint. This is a significant change for many platforms, which will have to be much more transparent about their moderation processes and may need significant additional resources to deal with subsequent objections and appeals from users.

    Alongside these changes are other significant developments, including:

    • Platform providers cannot use interfaces that manipulate or distort the choices taken by users – in addition to those forms of manipulative practices that are already set out in the Unfair Commercial Practices Directive(15) and the GDPR.(16)
    • Suspension of repeat offenders: Where a user continues, after being warned, to ‘frequently’ provide unlawful content, the platform provider must suspend them for a reasonable time.
    • Disclosure of monthly active users: The platform provider must disclose the number of monthly active users in the EU.
    • Advertising and recommender system transparency: Online platforms shall not present advertising to users based on profiling with special category data. The platform provider must provide users with information about the advertisements they are shown, including the reasons why an advertisement was selected for them. Where an advertisement is based on profiling, the platform provider must also inform the user about any means available for them to change such criteria. Similarly, the platform provider must be transparent about the operation of any recommender system.
    • Seller verification: The platform provider must ensure that sellers on the platform identify themselves and must make best efforts to verify certain traceability information before allowing them to use the platform.
    • Online protection of minors: Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors.

    The highest tier of regulation applies to:

    1. Very large online platforms (VLOP): online platforms which have over forty-five million monthly active users in the EU, a number equivalent to 10% of the EU population, and which are designated as such by the Commission.
    2. Very large online search engines (VLOSE): online search engines which have over forty-five million monthly active users in the EU and which are designated as such by the Commission.

    This designation brings with it some of the very strongest obligations in the DSA, considering the overall influence of such platforms. This includes obligations to conduct a risk assessment of their services and to take steps to mitigate any risks identified as part of that process.

    The DSA also operates by putting in place a baseline ‘notice and takedown’ system. Hosting providers (including online platforms) must allow third parties to notify them of any illegal content they are hosting. Once notified, the hosting provider will need to remove that content expeditiously in order to continue to benefit from the hosting defence. In addition, online platform providers must provide an expedited removal process for notifications from trusted flaggers, suspend users who frequently post illegal content, and provide additional protection to minors.
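    As a rough illustration of that baseline flow (hypothetical types and names only; the DSA prescribes obligations, not any particular implementation), the sketch below models a notice from a third party and the provider’s duty to process it diligently and report back, with expeditious removal being what preserves the hosting defence.

```python
# Hedged, illustrative sketch of a notice-and-action flow under the DSA.
# All names here are hypothetical; the Regulation specifies obligations, not code.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Notice:
    content_id: str
    notifier: str          # any third party may notify, not only authorities
    reason: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class Outcome:
    content_id: str
    removed: bool
    statement_of_reasons: str  # the affected user must be told what was decided and why


def handle_notice(notice: Notice, found_illegal: bool) -> Outcome:
    """Process a notice diligently and report back on the action taken."""
    if found_illegal:
        # Acting expeditiously on actual knowledge preserves the hosting defence.
        return Outcome(notice.content_id, True,
                       f"Removed after notice from {notice.notifier}: {notice.reason}")
    return Outcome(notice.content_id, False,
                   "Reviewed and not assessed as illegal; content remains available")
```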

    Alongside these protections, VLOP and VLOSE have specific obligations to assess and mitigate ‘systemic risks’ arising from their services. That assessment must include the risks of or to:

    1. Illegal content: This encompasses a wide range of harmful material including hate speech.
    2. Fundamental rights: This applies where content would impact on the exercise of fundamental rights, such as freedom of expression, privacy, the right to non-discrimination and consumer protection. Importantly, this does not just mean removing content but also actively supporting free speech by taking measures to counter the submission of abusive take down notices.
    3. Democracy: This encompasses negative effects on the democratic process, civic discourse, and electoral processes, as well as public security.  

    Finally, this framework will provide extra protection for recognised media sources through the proposed Regulation establishing a common framework for media services (European Media Freedom Act).(17) This requires VLOP to allow recognised media sources to declare their status and imposes additional transparency and consultation obligations on VLOP in relation to the restriction or suspension of content from those sources.

    Footnotes

    1. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act).
    2. Chapter III, Section 4 of the DSA.
    3. Articles 1(2), 2(1)–(2) and 3(g)(i)–(iii), DSA.
    4. According to Article 2(2) of the DSA, it is not applicable ‘to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.’
    5. Case C-18/18 Glawischnig-Piesczek v Facebook Ireland Limited, ECLI:EU:C:2019:821.
    6. Ibid., §11.
    7. Ibid., §16.
    8. Art. 6(1), DSA.
    9. Art. 6(3), DSA.
    10. Recital 24, DSA.
    11. Opinion of Advocate General Maciej Szpunar (2 June 2022), Christian Louboutin v Amazon, Joined Cases C‑148/21 and C‑184/21, ECLI:EU:C:2022:422, paras 65-72.
    12. Ibid., §101.
    13. Recital 22, DSA.
    14. Art. 27(2), GDPR.
    15. Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (Unfair Commercial Practices Directive).
    16. European Data Protection Board, Guidelines 3/2022 on Dark Patterns in Social Media Platform Interfaces: How to Recognize and Avoid Them (adopted 14 March 2022).
    17. Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU (2022/0277 (COD)).