
    Intermediary Liability

    Module 5: Trends in Censorship by Private Actors

    Internet intermediaries – an overview

    ‘Internet intermediary’ is a broad, constantly developing term referring to the many services and stakeholders involved in providing access to internet services. The Council of Europe suggests the term encompasses “a wide, diverse and rapidly evolving range of service providers that facilitate interactions on the internet between natural and legal persons.” Their functions include connecting users to the internet; hosting web-based services; facilitating the processing of data; gathering information and storing data; assisting searching; and enabling the sale of goods and services.(1)

    Examples of internet intermediaries include:

    • Internet service providers who offer connectivity;
    • Web hosting companies that provide the infrastructure;
    • Search engines and social media platforms that provide content and facilitate communication.(2)

    Simply put, internet intermediaries are “the pipes through which internet content is transmitted and the storage spaces in which it is stored”, and they are therefore essential to the functioning of the internet.(3) Internet intermediaries occupy a pivotal role in the current digital climate, shaping social, economic and political exchanges. They can influence the dissemination of ideas and have been described as the “custodians of our data and gatekeepers of the world’s knowledge”.(4)

    It is not difficult to see the link between internet intermediaries and the advancement of an array of human rights. As gatekeepers to the internet, they occupy a unique position in which they can enable the exercise of freedom of expression, access to information and privacy rights. The 2016 Report of the UN Special Rapporteur on freedom of opinion and expression (UNSR) noted that:

    “The contemporary exercise of freedom of opinion and expression owes much of its strength to private industry, which wields enormous power over digital space, acting as a gateway for information and an intermediary for expression.”

    Internet intermediary liability

    Given the important roles that intermediaries play in society, with influence on either upholding or infringing a myriad of implicated rights, it is imperative to understand their legal liability. The Association for Progressive Communications (APC) explains that intermediary liability refers to the extent to which internet intermediaries should be held responsible for what users do through their services. Where intermediary liability exists, intermediaries are obliged to prevent unlawful or harmful activity by users of their services, and failure to do so may lead to legal consequences such as compliance orders or criminal sanctions.

    For example, in 2023 the Malaysian Communications and Multimedia Commission (MCMC) announced that it would take legal action against Meta for what it saw as a failure to promptly remove content deemed harmful.(5) This reportedly included matters related to race, royalty, religion, and instances of defamation, impersonation, online gambling, and fraudulent advertisements. Digital rights advocates argued that the MCMC’s threat of legal action against a social media platform for its content moderation decisions poses a potential risk to intermediary liability principles and online freedom of expression.(6)

    In a report on the liability of internet intermediaries in Nigeria, Kenya, South Africa, and Uganda, APC captured the following ways in which intermediary liability can arise:

    • Copyright infringement.
    • Digital privacy.
    • Defamation.
    • National and public security.
    • Hate speech.
    • Child protection.
    • Intellectual property disputes.

    While intermediary liability can be associated with a legitimate interest, there are growing concerns, as noted by the UNSR in the 2016 Report, about the “appropriate balance between freedom of expression and other human rights” and the misuse of intermediary liability to curb expression and access.(7) The legal liability of intermediaries has a direct impact on users’ rights: intermediaries facing potential liability are more likely to be pre-emptively restrictive, and even to prevent lawful activity, in order to avoid legal consequences. In this regard, there is a direct correlation between restrictive liability laws – the over-regulation of content – and increased censorship, monitoring and restriction of legitimate and lawful online expression. There are three general approaches to intermediary liability, each with differing considerations and implications: strict liability, the broad immunity model, and the safe-harbour model.

    Strict liability

    Under this approach, intermediaries are liable for third-party content. The 2014 UNESCO report on fostering freedom online (see note 7) states that the only way to avoid liability is to proactively monitor, filter and remove content in order to comply with state law. Failing to do so places an intermediary at risk of fines, criminal liability, and revocation of business or media licences. The UNESCO report notes that China and Thailand follow the strict liability approach, which is largely considered inconsistent with international norms and standards.

    Strict Liability in China

    The Stanford CIS World Intermediary Liability Map, which documents intermediary laws around the world, has captured the following in relation to China:

    • In 2000, China’s State Council imposed obligations relating to “producing, assisting in the production of, issuing, or broadcasting” information that contravened an ambiguous list of principles (for example, opposing the basic principles confirmed in the Constitution; disrupting national policies on religion; propagating evil cults and feudal superstitions; and spreading rumours, disturbing social order, or disrupting social stability).
    • China has followed through with its strict liability approach and continues to hold internet companies liable if they fail to comply with these obligations. The result has been wide-scale filtering and monitoring by intermediaries, to the point that social media companies have become the principal censors of their users’ content.

    Broad immunity model

    On the other end of the spectrum is the broad immunity model, which provides exemptions from liability without distinguishing between intermediary function and content. The UNESCO report cites the Communications Decency Act in the United States as an example of this model: it protects intermediaries from liability for unlawful third-party content, including when they remove content in compliance with their own company policies. ARTICLE 19 explains that under this model, intermediaries are not responsible for the content they carry, although they remain responsible for content they themselves produce. The Organisation for Economic Co-operation and Development (OECD), in its Council Recommendation on principles for internet policy, refers to this as the preferred model, as it conforms with the best practices discussed below and gives due regard to the promotion and protection of the global free flow of information online.

    Safe-harbour model

    The safe-harbour model, otherwise known as conditional liability, adopts a middle-ground approach: intermediaries enjoy immunity provided they comply with certain requirements. Under this approach, intermediaries do not have to actively monitor and filter content, but they are expected to remove or disable content upon receiving notice that it includes infringing material. Central to this approach are ‘notice and takedown procedures’, which can be content- or issue-specific. There are mixed views on this model: for some, it is a fair middle ground; for others, it is a necessary evil that guards against increased filtering or a complete change in the intermediary landscape.(8) As noted in the UNESCO report, still others express concern that the approach is susceptible to abuse and may lend itself to self-censorship, as it gives intermediaries quasi-judicial power to evaluate and determine the legality of content.

    Conditional liability in South Africa

    The Freedom of Expression Institute explains the position in South Africa as follows:

    • Chapter 11 of the South African Electronic Communications and Transactions Act 25 of 2002 provides for limited liability of internet intermediaries, subject to a takedown-notice condition. These provisions apply to members of the Internet Service Providers’ Association. If an ISP receives a takedown notice to remove harmful content, it must respond immediately, failing which its immunity from liability is forfeited.
    • Criticism of South Africa’s framework mirrors broader concerns about the safe-harbour approach: ISPs err on the side of caution and are quick to remove content without giving the content provider an opportunity to defend it, and there are no appeal mechanisms for content creators or providers. This is particularly concerning given that any individual can submit a takedown notice.(9)
    • The potential for these mechanisms to be abused became clear in 2019 when an ISP briefly took down the South African news portal Mail & Guardian Online in response to a fraudulent takedown request which appears to have been submitted in retaliation for an investigative report about a convicted fraudster at the centre of a controversial South African oil deal.(10)

    At the core of the debate between the various models is the need to understand the difference between lawful and unlawful content. There is a chilling effect on expression when internet intermediaries are left to their own devices to determine what is acceptable or legal, as they are likely to err towards more censorship rather than less, out of fear of liability.

    Keeping in line with a human rights perspective, this guide advocates that “[t]he right to freedom of expression online can only be sufficiently protected if intermediaries are adequately insulated from liability for content generated by others.”(11) The following section provides some guidance on applicable international human rights frameworks that can be relied on when advocating for rights in relation to intermediary liability.

    Intermediary liability in the courts

    Intermediary liability has been dealt with at some length by the European Court of Human Rights (ECtHR). In the groundbreaking case of Delfi AS v Estonia, the Court found that holding an online news portal liable for offensive comments posted by readers below one of its news articles did not violate its right to freedom of expression.

    In Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v Hungary, however, the Court found that imposing objective liability for unlawful comments made by readers on a website required “excessive and impracticable forethought capable of undermining freedom of the right to impart information on the Internet.”

    More recently, Media Defence and the Electronic Frontier Foundation (EFF) intervened in a case before the Grand Chamber of the ECtHR concerning the liability of online users for third-party comments. In Sanchez v France, a French politician was convicted of incitement to hatred on religious grounds after failing to promptly delete comments posted by other parties on the public ‘wall’ of his Facebook account; the individuals who posted the comments were convicted of the same offence. The Fifth Section of the ECtHR held that his conviction did not breach his Article 10 rights, despite his apparent lack of knowledge of the comments. The judgment was referred to the Grand Chamber, which dismissed the application in 2023.

    In 2021, the UN Special Rapporteur warned against the trend of states passing regulations and issuing orders to pressure online platforms to police speech, rather than creating rights-preserving processes that can be adjudicated through the courts, noting:

    “The risk with such laws is that intermediaries are likely to err on the side of caution and “over-remove” content for fear of being sanctioned.”(12)

    Different interest groups continue to push different agendas in relation to internet intermediaries and their liability. Many countries have either no relevant laws or laws that are vague and inconsistent, making it difficult to enforce rights. There are, however, applicable international human rights frameworks that guide how laws should be enacted and how restrictions may be imposed. As with any rights-based advocacy or litigation, it is necessary first to establish the rights invoked; as discussed above, internet intermediaries clearly play a vital role in the advancement of an array of rights. The next step is to determine responsibility.

    In relation to internet intermediaries, the triad of information rights – freedom of expression, access to information and privacy – is clearly invoked. The UN Framework for Business and Human Rights provides that states are primarily responsible for ensuring that internet intermediaries act in a manner that ensures the respect, protection and promotion of the fundamental rights and freedoms of internet users. At the same time, the intermediaries themselves have a responsibility to respect the recognised rights of their users.

    Although there might be complexities regarding the cross-jurisdictional scope of intermediaries’ powers and responsibilities, international human rights norms should always be at the fore.

    Given the link between internet intermediaries and the fundamental right to freedom of expression, it is best to engage with this topic by testing laws, regulations and policies against prescribed human rights standards, and by understanding the restrictions and limitations that may be applicable. As discussed in previous sections, any restriction on the right to freedom of expression must satisfy a strict, narrow three-part test – namely, the restriction must:

    • Be provided by law;
    • Pursue a legitimate aim; and
    • Conform to the strict tests of necessity and proportionality.(13)

    Laws, content restriction orders, and practices must comply with this test. Practically, the need to assess the compliance of legislative frameworks arises most often in jurisdictions that adopt the strict liability model or the safe-harbour model. The strict liability model can be easily tested and found to be non-compliant. The safe-harbour model requires slightly deeper engagement to determine compliance, as the following example – namely Kenya’s Copyright (Amendment) Act of 2022 – shows.

    Copyright reform in Kenya

    In 2022, Kenya passed the Copyright (Amendment) Act into law. While the final Act did not deal substantively with intermediary liability – a result of drafting changes during the public participation process – earlier versions of the Copyright (Amendment) Bill contained some interesting proposals regarding intermediary liability in the African context. A key feature of earlier versions of the Bill was the introduction of the safe-harbour approach, providing for “conduit” safe harbours and “caching” safe harbours. The former would have protected intermediaries from liability for copyright infringements if their involvement was limited to “providing access to or transmitting content, routing or storage of content in the ordinary course of business”.

    Under those circumstances, the intermediary would not have been obliged to take down or disable content upon receipt of a takedown notice. Under the latter, as per (former) section 35A(1)(b), intermediaries would have been protected where their role was limited to content storage that is “automatic, intermediate and temporary”; this protection would have been conditional on the removal of content following a takedown notice.(14)

    Civil society criticised the Bill’s vague and unclear notice-and-takedown procedures, noting that they fell short of international standards on freedom of expression. ARTICLE 19 listed five problems with the Bill’s notice-and-takedown procedures:

    • Lack of proportionality: criminal sanctions would have been imposed on intermediaries who failed to remove content. As discussed above, this would cause intermediaries to lean toward censorship and blocking, which infringes on freedom of expression.
    • Lack of clarity: the procedures were vague and did not provide clarity on the issue of counter-notices.
    • Lack of due process: there was no mention of judicial review or appeal mechanisms. There was also no requirement to notify the content publisher of the alleged infringement. The 48-hour timeframe for content removal would not have allowed for a counter-notice.
    • Lack of transparency: there was no obligation to maintain records of takedown requests or provide access to such records.
    • Severe sanctions: the harsh sanctions for false takedown notices would have been disproportionate to the aim of deterring such notices.

    It is apparent that the necessity and proportionality legs of the test proved to be the sticking points in relation to this Bill. While the safe-harbour model might serve a legitimate aim, if the guiding regulations are not clear, necessary and proportionate, then the limitation on freedom of expression is unjustifiable. These sections of the Bill were removed, and the Act was passed in 2022 without addressing intermediary liability.

    In 2015, a group of civil society organisations drafted a framework of baseline safeguards and best practices to protect human rights when intermediaries are asked to restrict online content. Known as the Manila Principles, these were drafted with the intention of being “considered by policymakers and intermediaries when developing, adopting, and reviewing legislation, policies and practices that govern the liability of intermediaries for third-party content.” Advocates and litigators should likewise rely on these best-practice principles, which are based on international human rights instruments and other international legal frameworks, when advancing online rights.

    Manila Principles

    The key tenets of the Manila Principles on Intermediary Liability are:

    • Intermediaries should be shielded from liability for third-party content.
    • Content must not be required to be restricted without an order by a judicial authority.
    • Requests for restrictions of content must be clear, unambiguous, and follow due process.
    • Laws and content restriction orders and practices must comply with the tests of necessity and proportionality.
    • Laws and content restriction policies and practices must respect due process.
    • Transparency and accountability must be built into laws and content restriction policies and practices.

    Digital rights advocates have used these principles to test whether states’ legal frameworks and regulations for intermediary liability are adequate. For example, in 2018 India’s ICT ministry published draft regulations that would have added new restrictions to the country’s existing intermediary liability framework, including, for example, a requirement that internet intermediaries automatically and proactively filter out content promoting cigarettes and alcohol.(15) The Centre for Internet and Society (CIS) made submissions showing that the draft 2018 Rules were not aligned with the Manila Principles and had the potential to infringe on the right to freedom of expression. At the time of this publication, the provisions in the 2018 Draft Rules had not been brought into force, and the CIS approach is a useful illustration of how the Manila Principles can be used to test domestic legislation against international best practices.

    However, from 2021 to 2023 the Ministry proposed new, more extensive changes to the intermediary liability framework.(16) While these do not include the controversial provisions of the 2018 Draft Rules, they extend new liabilities to the online gaming industry and include new restrictions on publishing information that is “patently false and untrue or misleading in nature”. In follow-up submissions, the CIS argued that this effectively requires intermediaries to fact-check any content published through their services, which it contends is unconstitutional.

    The apparent success in having the draft 2018 Rules withdrawn illustrates the importance of digital rights advocates bringing international law to bear in their policy engagements. Yet the subsequent developments in India’s intermediary liability framework underscore the ongoing debates and the need for further engagement to ensure that emerging policies uphold the principles of freedom of expression online.

    Conclusion

    Internet intermediaries play a crucial role in the advancement of human rights. Intermediary liability needs to be understood holistically in relation to the prevention of harm, the protection of free speech and access to information, and encouraging innovation and creativity.(17)

    While online harms and unlawful content continue to grow:

    “The law must find a way to flexibly address these changes, with an awareness of the ways in which existing and proposed laws may affect the development of information intermediaries, online speech norms, and global democratic values.”(18)

    Footnotes

    1. Media Defence, ‘Training Manual on Digital Rights and Freedom of Expression Online’ (accessible at https://www.mediadefence.org/resources/mldi-training-manual-digital-rights-and-freedom-expression-online).
    2. ARTICLE 19, ‘Internet Intermediaries: Dilemma of Liability’ (2013) at 3 (accessible at https://www.article19.org/data/files/Intermediaries_ENGLISH.pdf). See further Li, ‘Beyond Intermediary Liability: The Future of Information Platforms’, Yale Law School Information Society Project (2018) at 9 (accessible at https://law.yale.edu/sites/default/files/area/center/isp/documents/beyond_intermediary_liability_-_workshop_report.pdf).
    3. Id at 6.
    4. Riordan, ‘The Liability of Internet Intermediaries’, DPhil thesis, Oxford University (2013) at 1 (accessible at https://ora.ox.ac.uk/objects/uuid:a593f15c-583f-4acf-a743-62ff0eca7bfe/download_file?file_format=pdf&safe_filename=THESIS02&type_of_work=Thesis).
    5. MCMC press statement (2023) (accessible at https://www.mcmc.gov.my/en/media/press-releases/non-cooperation-to-remove-undesirable-contents-fro).
    6. ARTICLE 19, ‘Malaysia: Halt legal action against Meta over content moderation’ (2023) (accessible at https://www.article19.org/resources/malaysia-halt-legal-action-against-meta/).
    7. A 2014 UNESCO report on fostering freedom online and the role of internet intermediaries provides a comprehensive overview of the above regulatory objectives pursued by states, which in turn have a direct impact on how, and to what extent, intermediaries are compelled to restrict freedom of expression online.
    8. Elkin-Koren, Nahmias and Perel, ‘Is It Time to Abolish Safe Harbor? When Rhetoric Clouds Policy Goals’, Stanford Law & Policy Review (forthcoming, 2019) at 47 (accessible at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3344213).
    9. See further Comninos, ‘Intermediary liability in South Africa’ (2012) (accessible at https://www.apc.org/sites/default/files/Intermediary_Liability_in_South_Africa-Comninos_06.12.12.pdf). See also Rens, ‘Failure of Due Process in ISP Liability and Takedown Procedures’ in Global Censorship, Shifting Modes, Persisting Paradigms (2015) (accessible at https://law.yale.edu/sites/default/files/area/center/isp/documents/a2k_global-censorship_2.pdf).
    10. Mail & Guardian, ‘The digital breadcrumbs behind the M&G’s censorship attack’ (2019) (accessible at https://mg.co.za/article/2019-10-04-00-the-digital-breadcrumbs-behind-the-mgs-censorship-attack/).
    11. Media Defence above n 1 at 28.
    12. UNHRC, ‘Disinformation and freedom of opinion and expression’ (2021) (accessible at https://www.ohchr.org/en/documents/thematic-reports/ahrc4725-disinformation-and-freedom-opinion-and-expression-report).
    13. For a detailed outline of the limitation of freedom of expression, see Module 2 on Restricting Access and Content at 4–5. See further OSCE, ‘Media Freedom on the Internet: An OSCE Guidebook’ (2016) (accessible at https://www.osce.org/netfreedom-guidebook?download=true).
    14. For a more detailed discussion of the Bill, see Walubengo and Mutemi, ‘Treatment of Kenya’s Internet Service Providers (ISPs) under the Kenya Copyright (Amendment) Bill, 2017’, The African Journal of Information and Communication (2019) (accessible at https://journals.co.za/docserver/fulltext/afjic_n23_a5.pdf).
    15. Ministry of Electronics and Information Technology, ‘Draft Information Technology [Intermediaries Guidelines (Amendment)] Rules, 2018’ (2018) (accessible at https://www.meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf).
    16. Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules (2021) (accessible at https://prsindia.org/billtrack/the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021).
    17. Keller, ‘Build Your Own Intermediary Liability Law: A Kit for Policy Wonks of All Ages’ in Li (ed), ‘New Controversies in Intermediary Liability Law’, Essay Collection, Yale Law School Information Society Project (2019) at 20 (accessible at https://law.yale.edu/sites/default/files/area/center/isp/documents/new_controversies_in_intermediary_liability_law.pdf).
    18. Li, ‘Beyond Intermediary Liability: The Future of Information Platforms’, Yale Law School Information Society Project (2018).