Intermediary Liability
Module 3: Access to the Internet
Intermediary liability occurs when governments or private litigants can hold technological intermediaries, such as ISPs and websites, liable for unlawful or harmful content created by users of those services.(1) This can occur in various circumstances, including:
- copyright infringements;
- digital piracy;
- trademark disputes;
- network management;
- spamming and phishing;
- cybercrime;
- defamation;
- hate speech;
- child sexual exploitation material;
- illegal content;
- offensive but legal content;
- censorship;
- broadcasting and telecommunications laws and regulations; and
- privacy protection.(2)
A report published by UNESCO identifies the following challenges facing intermediaries:(3)
- Limiting the liability of intermediaries for content published or transmitted by third parties is essential to the flourishing of internet services that facilitate expression.
- Laws, policies, and regulations requiring intermediaries to carry out content restriction, blocking, and filtering in many jurisdictions are not sufficiently compatible with international human rights standards for freedom of expression.
- Laws, policies, and practices related to government surveillance and data collection from intermediaries, when insufficiently compatible with human rights norms, impede intermediaries’ ability to adequately protect users’ privacy.
- Whereas due process generally requires that legal enforcement and decision-making are transparent and publicly accessible, governments are frequently opaque about requests to companies for content restriction, the handover of user data, and other surveillance requirements.
There is general agreement that insulating intermediaries from liability for content generated by others protects the right to freedom of expression online. Such insulation can be achieved either through a system of absolute immunity from liability, or through a regime that imposes liability on intermediaries only after they refuse to obey an order from a court or other competent body to remove the impugned content.
As to the latter, the 2011 Joint Declaration provides that intermediaries should only be liable for third-party content when they specifically intervene in that content or refuse to obey an order, adopted in accordance with due process guarantees by an independent, impartial, authoritative oversight body (such as a court), to remove it.(4) The African Declaration provides in Principle 39 that states should not require internet intermediaries to “proactively monitor content which they have not authored or otherwise modified”, and should ensure that human rights safeguards are mainstreamed in the moderation of online content and that all such decisions are made transparently, with possibilities for appeal and other remedies. It further provides that where law enforcement agencies request the immediate removal of online content because it poses an imminent risk of harm, such requests should be subject to judicial review.(5)
Jurisprudence around the world
While questions around intermediary liability have not yet been thoroughly considered by courts in Africa, a substantial body of jurisprudence is building up in other regions of the world, particularly Europe, Latin America, and Asia. For example, in 2023 the Malaysian Communications and Multimedia Commission (MCMC) announced that it would take legal action against Meta for what it saw as a failure to promptly remove content deemed harmful.(6) This reportedly included matters related to race, royalty, and religion, as well as instances of defamation, impersonation, online gambling, and fraudulent advertisements. Digital rights advocates argued that the MCMC’s threat of legal action against a social media platform for its content moderation decisions poses a potential risk to intermediary liability principles and online freedom of expression.(7)
The European Court of Human Rights (ECtHR) has considered intermediary liability in several cases:
- In Delfi AS v Estonia, the ECtHR examined the liability of an internet news portal for offensive comments posted by readers on its website.(8) The ECtHR ruled that holding the portal liable did not violate its right to freedom of expression, as the comments were highly offensive, the portal failed to prevent their publication, profited from them, and allowed anonymity for their authors.
- In Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v Hungary, the ECtHR addressed the liability of an internet news portal and a self-regulatory body for vulgar comments on their platforms.(9) While recognizing the duty of internet news portals to assume responsibilities, the ECtHR found that the comments did not constitute unlawful speech, upholding the right to freedom of expression.
- In Sanchez v France, the ECtHR departed from its previous decisions on imposing liability on social media users for third-party content.(10) Sanchez, a French politician, was fined by a French domestic court for failing to remove hateful comments against the Muslim community from his Facebook wall. Before the ECtHR, Sanchez argued that this fine violated his right to freedom of expression by requiring him to bear the disproportionate burden of monitoring all comments posted on his open and public Facebook wall. The Court ultimately held that Sanchez’s right to freedom of expression had not been violated: France’s interference was lawful, pursued a legitimate aim, and was necessary in a democratic society. It held that it was not disproportionate to attribute liability to all actors involved, including Sanchez, for failing to take action in relation to blatantly discriminatory comments. Importantly, the Court held that Sanchez’s duty to act reasonably was greater in his capacity as a politician.
Other courts have taken more definitive positions in respect of intermediary liability. For example, the Supreme Court of India has interpreted the domestic law to provide for intermediary liability only where an intermediary has received actual knowledge from a court order, or where an intermediary has been notified by the government that one of the unlawful acts proscribed under the law is going to be committed and the intermediary has subsequently failed to remove or disable access to such information.(11)
Furthermore, the Supreme Court of Argentina has held that search engines are under no duty to monitor the legality of third-party content to which they link, noting that only in exceptional cases involving “gross and manifest harm” could intermediaries be required to disable access.(12)
Non-consensual dissemination of intimate images
The non-consensual dissemination of intimate images (NCII) poses a particular challenge for questions of intermediary liability. Courts around the world have frequently ordered the immediate and unequivocal removal of such content from online platforms, citing the significant and adverse consequences for victims’ and survivors’ rights to privacy and dignity.
- The High Court of Delhi, India, for example, ordered that in cases of NCII, intermediaries must remove all offending content from their platforms, not just the specific links provided by victims. The Court highlighted the damage caused by the posting of NCII and how requiring victims to search the internet for new uploads in order to request their removal can cause further trauma.(13)
- In an earlier case, the same Court not only ordered the immediate removal of content from the website on which it had been published without consent, but also ordered search engines to de-index the content from their search results, stressing the need for “immediate and efficacious” remedies for victims in such cases.(14)
In light of the vital role played by intermediaries in promoting and protecting the right to freedom of expression online, it is imperative that they are safeguarded against unwarranted interference — by state and private actors — that could have a deleterious effect on the right. Because an individual’s ability to exercise the right to freedom of expression online depends on the passive nature of online intermediaries, any legal regime that causes an intermediary to apply undue restraint or self-censorship to content communicated through its services will ultimately have an adverse effect on that right.
The UNSR has noted that intermediaries can serve as an important bulwark against government and private overreach, as they are usually, for instance, best placed to push back on a shutdown.(15) However, this role can only truly be realised where intermediaries are able to act without fear of sanction or penalty.
At the same time, it is vital that appropriate remedies are established for the removal of illegal or harmful content, and that powerful private platforms are held accountable for the decisions they make with regard to moderating content in the digital sphere, where such decisions may infringe on the rights to freedom of expression and access to information.