    Intermediary Liability

    Module 3: Access to the Internet

    Intermediary liability occurs where governments or private litigants can hold technological intermediaries, such as ISPs and websites, liable for unlawful or harmful content created by users of those services.(1) This can occur in various circumstances, including copyright infringements, digital piracy, trademark disputes, network management, spamming and phishing, “cybercrime”, defamation, hate speech, child pornography, “illegal content”, offensive but legal content, censorship, broadcasting and telecommunications laws and regulations, and privacy protection.(2)

    A report published by UNESCO makes the following key findings regarding intermediaries:(3)

    • Limiting the liability of intermediaries for content published or transmitted by third parties is essential to the flourishing of internet services that facilitate expression.
    • Laws, policies, and regulations requiring intermediaries to carry out content restriction, blocking, and filtering in many jurisdictions are not sufficiently compatible with international human rights standards for freedom of expression.
    • Laws, policies, and practices related to government surveillance and data collection from intermediaries, when insufficiently compatible with human rights norms, impede intermediaries’ ability to adequately protect users’ privacy.
    • Whereas due process generally requires that legal enforcement and decision-making are transparent and publicly accessible, governments are frequently opaque about requests to companies for content restriction, the handover of user data, and other surveillance requirements.

    There is general agreement that insulating intermediaries from liability for content generated by others protects the right to freedom of expression online. Such insulation can be achieved either through a system of absolute immunity from liability, or through a regime that imposes liability on intermediaries only after they refuse to obey an order from a court or other competent body to remove the impugned content.

    As to the latter, the 2011 Joint Declaration provides that intermediaries should only be liable for third party content when they specifically intervene in that content or refuse to obey an order adopted in accordance with due process guarantees by an independent, impartial, authoritative oversight body (such as a court) to remove it.(4) The 2019 Declaration of Principles on Freedom of Expression and Access to Information in Africa provides in Principle 39 that states should not require internet intermediaries to “proactively monitor content which they have not authored or otherwise modified”, and that states should ensure that human rights safeguards are mainstreamed in the moderation of online content and that all such decisions are made transparently, with the possibility of appeals and other remedies. It further provides that where law enforcement agencies request the immediate removal of online content because it poses an imminent risk of harm, such requests should be subject to judicial review.(5)

    While questions around intermediary liability have not yet been thoroughly considered by courts in Africa, a substantial body of jurisprudence is building up in other regions of the world, particularly Europe, Latin America, and India. For example, the ECtHR has considered intermediary liability in several cases:

    • In 2013, in the case of Delfi AS v Estonia, the ECtHR considered the liability of an internet news portal for offensive comments posted by readers below one of its online news articles.(6) The portal complained that being held liable for its readers’ comments breached its right to freedom of expression. The ECtHR rejected the complaint, holding that the domestic courts’ finding of liability was a justified and proportionate restriction of freedom of expression: the comments were highly offensive, and the portal had failed to prevent them from becoming public, had profited from their existence, and had allowed their authors to remain anonymous. It further noted that the fine imposed by the Estonian courts was not excessive.
    • In 2016, in the case of Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v Hungary, the ECtHR considered the liability of a self-regulatory body of internet content providers and an internet news portal for vulgar and offensive online comments posted on their websites.(7) The ECtHR reiterated that, although not publishers of comments in the traditional sense, internet news portals still had to assume duties and responsibilities. It found that, although offensive and vulgar, the comments had not constituted unlawful speech, and held that there had been a violation of the applicants’ right to freedom of expression.
    • In 2017, in the case of Tamiz v United Kingdom, the ECtHR had cause to consider the ambit of intermediary liability.(8) The applicant, a former politician in the United Kingdom, had claimed before the domestic courts that a number of third-party comments posted by anonymous users on Google’s Blogger platform were defamatory. Before the ECtHR, the applicant argued that his right to respect for his private life had been violated because the domestic courts had refused to grant him a remedy against the intermediary. The ECtHR ultimately dismissed his claim on the basis that the resulting damage to his reputation was trivial. It highlighted the important role that ISPs perform in facilitating access to information and debate on a wide range of political, social and cultural rights, and appeared to endorse the argument that ISPs should not be obliged to monitor content or proactively investigate potentially defamatory activity on their sites.

      Other courts have taken more definitive positions in respect of intermediary liability. For example, the Supreme Court of India has interpreted domestic law as providing for intermediary liability only where an intermediary has received actual knowledge from a court order, or where an intermediary has been notified by the government that one of the unlawful acts prescribed under the law is going to be committed and has subsequently failed to remove or disable access to the information in question.(9) Furthermore, the Supreme Court of Argentina has held that search engines are under no duty to monitor the legality of third-party content to which they link, noting that only in exceptional cases involving “gross and manifest harm” could intermediaries be required to disable access.(10)

      The non-consensual dissemination of intimate images (NCII) poses a particular challenge for questions of intermediary liability. Courts around the world have frequently ordered the immediate and unequivocal removal of such content from online platforms, citing the significant and adverse consequences for victims’ and survivors’ rights to privacy and dignity. The High Court of Delhi, India, for example, ordered not only the immediate removal of non-consensually published content from the website on which it appeared, but also directed search engines to de-index the content from their search results, stressing the need for “immediate and efficacious” remedies for victims in such cases.(11)

      This also relates to a concept known as ‘the right to be forgotten,’ which supporters argue creates an obligation on internet intermediaries to delete certain content on the request of a person who is the subject of such content. At present, the issue is being considered in multiple jurisdictions as the appropriate balance is sought between protecting the right to privacy and dignity and the right to access information of public importance.

      In light of the vital role played by intermediaries in promoting and protecting the right to freedom of expression online, it is imperative that they are safeguarded against unwarranted interference, whether by state or private actors, that could have a deleterious effect on the right. For example, because an individual’s ability and freedom to exercise the right to freedom of expression online depends on the passive nature of online intermediaries, any legal regime that causes an intermediary to apply undue restraint or self-censorship to content communicated through its services will ultimately have an adverse effect on the right to freedom of expression online. The UNSR has noted that intermediaries can serve as an important bulwark against government and private overreach, as they are usually, for instance, best placed to push back against a shutdown.(12) However, this can only truly be realised where intermediaries are able to do so without fear of sanction or penalties.

      At the same time, it is vital that appropriate remedies are established for the removal of illegal or harmful content, and that powerful private platforms are held accountable for the decisions they make with regard to moderating content in the digital sphere, where such decisions may infringe on the rights to freedom of expression and access to information.


      1. Alex Comninos, ‘The liability of internet intermediaries in Nigeria, Kenya, South Africa and Uganda: An uncertain terrain’ (2012) at p 6 (accessible at:
      2. Id.
      3. Rebecca MacKinnon et al, ‘Fostering freedom online: The role of internet intermediaries’ (2014) at pp 179-180 (accessible at:
      4. 2011 Joint Declaration at paras 2(a)-(b).
      5. Principle 39.
      6. Application No. 64569/09, 10 October 2013 (accessible at: https://
      7. Application No. 22947/13, 2 February 2016 (accessible at:
      8. Tamiz v United Kingdom, Application No. 3877/14, 19 September 2017 (accessible at: Media Defence, together with a coalition of organisations, made submissions to the ECtHR on proposed principles for intermediary liability based on best practices in national legislation, the views of the Committee of Ministers of the Council of Europe (CoE) and special mandate holders. The proposed principles are as follows: i) Intermediaries should not be the arbiters of the lawfulness of content posted, stored or transferred by the users of their services. ii) Assuming that they have not contributed to or manipulated content, intermediaries should not be liable for content posted, stored or transferred using their services unless and until they have failed to comply with an order of a court or other competent body to remove or block specific content. iii) Notwithstanding the above, intermediaries should in no circumstances be liable for content unless it has been brought to their attention in such a way that the intermediary can be deemed to have actual knowledge of the illegality of that content. iv) A requirement to monitor content on an ongoing basis is incompatible with the right to freedom of expression contained in article 10 of the European Convention on Human Rights. The submissions are accessible here:
      9. Shreya Singhal v Union of India, Application No. 167/2012 at paras 112-118 (accessible at:
      10. María Belén Rodríguez v Google, Fallo R.522.XLIX (accessible at: The decision is described in the 2016 Report of the UNSR on Freedom of Expression at para 52.
      11. 2017 Report of the UNSR on Freedom of Expression at para 50.