The rise of the ‘big tech giants’ has occasioned dramatic changes in the global information ecosystem, including the ways in which people access news and public information, and how news media organisations generate income and connect with their audiences.

It has also created complications in how illegal or rights-violating content, such as hate speech, is dealt with. Who is responsible for the publication of such content? What happens when it goes viral and has real-world impacts on people’s lives? What is the role of internet intermediaries, such as internet service providers, search engines or social media platforms, in all this? The concept of ‘intermediary liability’ has developed as a partial response to some of these questions.

The common understanding is that providers of online services are not responsible for information published by users of their services, a principle established by early interpretations of Section 230 of the United States Communications Decency Act (CDA). Many argue that the CDA provides certainty that enables free speech, but in recent years, as extremism and misinformation have flourished online, the concept has once again come up for debate.

Summary Modules

Media Defence has developed a series of summary modules that provide an overview of the right to freedom of expression and how it has developed in the modern digital age.  Intermediary liability is a concept that is relevant to many aspects of digital rights.

For example, it has implications for the publication of defamation or hate speech. It affects who is responsible for cybercrimes committed over online platforms, and for the blocking and filtering of content by internet providers. It also has consequences for data protection and privacy, in terms of whether individuals can require providers to remove content about them (known as the ‘right to be forgotten’).

Advanced Modules

Intermediary liability protection is increasingly under threat as governments seek to police crimes committed online and clamp down on polarising misinformation and disinformation. While there are some justifiable policy goals in holding intermediaries accountable for the content that they allow to flourish on their platforms, there are also arguments against removing intermediary liability protection altogether, as it is seen as one of the core operating principles of the early internet that allowed online innovation to flourish.

Media Defence’s series of Advanced Modules on Digital Rights and Freedom of Expression Online provides a more comprehensive review of current developments and jurisprudence in the field of digital rights. In combination with the Summary Modules above, these resources form the basis of our introductory and advanced litigation surgeries. The Advanced Modules have been designed to assist lawyers representing journalists, bloggers and other online media in East, West and Southern Africa.

They include more detailed interrogations of how intermediary liability affects access to content online, and the power wielded by private actors in determining what content is accessible to the public.  They also delve deeper into the ‘right to be forgotten’ and how the regulation of online content affects data protection and privacy for users.

Key Case Law

Litigation in the field of digital rights, and in particular the ‘right to be forgotten’, has advanced significantly in recent years as a result of a few seminal cases, particularly in the European Union. Nevertheless, intermediary liability remains one of the most contested areas of internet regulation around the world, and is treated in varying ways across jurisdictions.

The European Court of Human Rights (ECtHR) found that an online news portal was liable for offensive comments it allowed to be posted below one of its news articles.


The Supreme Court of India found that intermediaries are only liable where they have received actual knowledge through a court order, or have been notified by the government, and have subsequently failed to remove or disable access to the information.


The Supreme Court of Argentina held that search engines are under no duty to monitor the legality of third-party content to which they link, and that only in exceptional circumstances could they be required to disable access to it.


The ECtHR found that internet portals assume duties and responsibilities, particularly in the context of offensive and vulgar comments, even where the speech is not unlawful.


Media Defence’s Work

Additional Resources