    Access to Content: Censorship, Blocking and Filtering

    Module 2: Restricting Access and Content

    Overview of censoring, blocking and filtering of content

    Access to information is a central tenet of the internet. However, efforts to restrict access have developed in step with the very infrastructure and technology that should enable it. In many jurisdictions, state and non-state actors implement technical measures to limit, influence, monitor, and control people’s access to the internet, including by censoring, blocking, filtering, and monitoring content. While these measures may not be as extreme as complete internet shutdowns, they nonetheless hinder the full enjoyment of the right to freedom of expression and have the potential to severely distort and disrupt people’s access to information online.

    Censorship generally refers to restricting or limiting access to information (or related services) that is either illegal in a particular jurisdiction, is considered a threat to public order, or is objectionable for a particular audience.

    Blocking typically refers to the prevention of access to specific websites, domains, IP addresses, protocols or services included on a blacklist.(1) Justifications for blocking often include the need to prevent access to illegal content, or content that is a threat to public order or is objectionable for a particular audience.(2)

    Filtering can relate to the use of technology that blocks pages by reference to certain characteristics, such as traffic patterns, protocols or keywords, or based on their perceived connection to content deemed inappropriate or unlawful.

    Note: This distinction might be considered semantic, but it can also be seen as a matter of scale and perspective. The key commonality is that both limit access to the internet.(3)

    As explained by ARTICLE 19, there are different ways in which access to content can be restricted, for example:(4)

    • URL blocking blocks a specific web page.
    • IP address blocking prevents connection to a host.
    • DNS tampering blocks entire domain names (the sketch after this list shows one way to test for it).
    • Blacklisting compiles a list of URLs to be filtered, while whitelisted URLs are not subject to blocking or filtering.
    • Keyword blocking is generally used to enable the blocking of specific categories of content.
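
    To make these techniques concrete, the short Python sketch below illustrates one way to test for DNS tampering: it compares the answers given by the local (often ISP-operated) resolver with those of an independent resolver queried over HTTPS. This is a minimal, illustrative sketch, not a forensic tool: the Google Public DNS JSON API is assumed to behave as publicly documented, the domain at the end is a placeholder, and divergent answers can have innocent causes (such as content delivery networks), so the output is a starting point for investigation rather than proof of blocking.

        import json
        import socket
        import urllib.request

        def local_a_records(domain):
            # Ask the system's default resolver (often the ISP's) for IPv4 addresses.
            try:
                return {info[4][0] for info in socket.getaddrinfo(domain, 80, socket.AF_INET)}
            except socket.gaierror:
                return set()  # resolution failed (e.g. an injected NXDOMAIN)

        def reference_a_records(domain):
            # Ask an independent resolver over HTTPS (Google Public DNS JSON API),
            # which is harder for an on-path network to tamper with.
            url = f"https://dns.google/resolve?name={domain}&type=A"
            with urllib.request.urlopen(url, timeout=10) as resp:
                answers = json.load(resp).get("Answer", [])
            return {a["data"] for a in answers if a.get("type") == 1}  # type 1 = A record

        def check(domain):
            local, reference = local_a_records(domain), reference_a_records(domain)
            if not local and reference:
                print(f"{domain}: local resolution failed but the reference resolver "
                      "answered - possible DNS-level blocking")
            elif local and reference and not (local & reference):
                print(f"{domain}: resolvers disagree entirely - possible DNS tampering "
                      "(note that content delivery networks can also cause this)")
            else:
                print(f"{domain}: no obvious DNS-level interference detected")

        check("example.org")  # placeholder; substitute the site under investigation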

    The rise of disinformation has also contributed to an increase in blocking and filtering, with states trying to mitigate the spread of false information and, in some instances, legally permitting blocking and filtering in order to prohibit and punish the dissemination of false or inaccurate statements.

    Applicable international human rights standards

    The same general considerations relating to access, online rights and freedom of expression discussed above apply here, alongside specific considerations relating to filtering and blocking. In 2011, in a Joint Declaration on Freedom of Expression and the Internet, a collective of Special Rapporteurs and experts stated the following in relation to filtering and blocking:

    • Mandatory blocking of entire websites, IP addresses, ports, network protocols or types of uses (such as social networking) is an extreme measure – analogous to banning a newspaper or broadcaster – which can only be justified in accordance with international standards, for example, where necessary to protect children against sexual abuse.
    • Content filtering systems which are imposed by a government or commercial service provider and which are not end-user controlled are a form of prior censorship and are not justifiable as a restriction on freedom of expression. 
    • Products designed to facilitate end-user filtering should be accompanied by clear information to end-users about how they work and their potential pitfalls in terms of over-inclusive filtering.

    In a 2016 report, the UN Special Rapporteur on freedom of expression (the UNSR on FreeEx) explained that:

    “States often block and filter content with the assistance of the private sector. Internet service providers may block access to specific keywords, web pages or entire websites.  On platforms that host content, the type of filtering technique depends on the nature of the platform and the content in question.  Domain name registrars may refuse to register those that match a government blacklist; social media companies may remove postings or suspend accounts; search engines may take down search results that link to illegal content.  The method of restriction required by Governments or employed by companies can raise both necessity and proportionality concerns, depending on the validity of the rationale cited for the removal and the risk of removal of legal or protected expression.

    Ambiguities in State regulation coupled with onerous intermediary liability obligations could result in excessive filtering.  Even if content regulations were validly enacted and enforced, users may still experience unnecessary access restrictions.  For example, content filtering in one jurisdiction may affect the digital expression of users in other jurisdictions.  While companies may configure filters to apply only to a particular jurisdiction or region, there have been instances where they were nevertheless passed on to other networks or areas of the platform.”

    In a case in the European Court of Human Rights in which access to a lawful website was obstructed as a result of blocking measures applied to an illegal website, the Court stated that “when exceptional circumstances justify the blocking of illegal content, a State agency making the blocking order must ensure that the measure strictly targets the illegal content and has no arbitrary or excessive effects, irrespective of the manner of its implementation. Any indiscriminate blocking measure which interferes with lawful content or websites as a collateral effect of a measure aimed at illegal content or websites amounts to arbitrary interference with the rights of owners of such websites.”(5)

    Blocking and filtering in Ethiopia

    Ethiopia has repeatedly made use of blocking and filtering mechanisms in the recent past. Between 2012 and 2018, hundreds of websites were blocked, including the websites of LGBTQI+ organisations, media outlets and CSOs like the Electronic Frontier Foundation.(6) In 2017, during a spate of anti-government protests, Facebook, Twitter, WhatsApp, and Dropbox were frequently blocked.

    In 2018, Freedom House noted that, with the change of regime, over 250 websites had been unblocked. Despite this, politically motivated blocking and filtering have continued in Ethiopia (and the full internet shutdown in the Tigray region remains ongoing). As of 2021, Freedom House confirmed that there were still no procedures for determining which websites are blocked or for appealing blocking decisions.

    Blocking and filtering in Turkey

    Turkey’s government has received sustained criticism for the “systematic actions the Turkish government has taken to restrict Turkey’s media environment, including closing media outlets, jailing media professionals, and blocking critical online content.”(7) In 2018, Freedom House found that over 3,300 URLs containing news items were blocked.

    In 2019, the Wikimedia Foundation, which owns and operates Wikipedia, petitioned the European Court of Human Rights (ECtHR) in relation to the blocking of Wikipedia in Turkey. While that petition was still pending, the Turkish Constitutional Court ruled that the blocking of Wikipedia was unconstitutional, and in January 2020 the Turkish government restored access to the site.

    Blocking of Twitter in Nigeria

    In a prominent recent example of content blocking, the federal government of Nigeria suspended the social media platform Twitter in 2021 after the platform removed a post by President Muhammadu Buhari that threatened to punish regional secessionists. The ban remained in place for seven months, until Twitter agreed to a number of the government’s demands, including opening a local office in Nigeria.

    The ban was subsequently declared unlawful by the ECOWAS Community Court of Justice in a case brought by the Socio-Economic Rights and Accountability Project (SERAP) and joined with other similar cases. The Court held that the ban violated the rights to freedom of expression and of access to information and the media, and it ordered the government to ensure that such a violation does not recur. Media Defence and Mojirayo Ogunlana-Nkanga represented the applicants.

    Blocking and filtering remain a contemporary concern. While such measures may be justifiable in limited instances, they generally constitute an unjustifiable infringement of rights and are often carried out with little guidance to the public and little to no regulation or oversight of the state.(8)

    Unjustifiable limitations

    As discussed above, and as with all limitations of the right to freedom of expression, restrictions are only permissible if they are provided by law, pursue a legitimate aim and conform to the strict tests of necessity and proportionality. In terms of “blanket” or “generic” bans, the UN Human Rights Committee found in its 2011 General Comment No. 34 that “generic bans on the operation of certain sites and systems are not compatible” with article 19 of the ICCPR. Where restrictions constitute “generic” bans, they will generally amount to an infringement of the right to freedom of expression.

    Justifiable limitations

    There may be circumstances in which measures such as the blocking and filtering of content are justifiable. The protection of children’s rights may be one such justification: blocking and filtering techniques can be developed and utilised to prevent the proliferation of, and exposure to, damaging material and to protect children from harmful and illegal content. However, despite this important purpose, UNICEF’s 2017 report on ‘Children’s Rights and Business in a Digital World: Freedom of Expression, Association, Access to Information and Participation’ recognised the inherent concerns around blocking and filtering, including a lack of transparency; the indiscriminate nature of filters; the lack of evidence showing where and when they have been deployed; and the threat of legitimate content being limited.(9) The children’s rights example illustrates that, even where there appears to be a legitimate purpose, rights can be unduly limited if the elements of legality, necessity and proportionality are not thoroughly and independently tested.

    In digital rights litigation, practitioners will do well to test every element of the limitations analysis before determining the appropriateness or otherwise of an imposed restriction. The ECtHR’s 2012 decision in Ahmet Yıldırım v Turkey provides guidance on the limitations analysis in relation to blocking and filtering.

    Case note: Ahmet Yıldırım v Turkey

    The applicant owned and ran a website on which he published his academic work and his views on various topics. In 2009, the Denizli Criminal Court in Turkey ordered the blocking of a third-party website as a preventive measure in the context of criminal proceedings against that site’s owner, who was accused of insulting the memory of Atatürk. Because blocking all access to Google Sites, the website hosting platform, was the only technical means of blocking the offending website, the court subsequently extended the order to the whole of Google Sites, rendering the applicant’s site inaccessible. The applicant unsuccessfully tried to have the blocking order lifted and applied to the ECtHR, submitting that the blocking of Google Sites amounted to indirect censorship.

    The ECtHR held that the impugned measure amounted to a restriction stemming from a preventive order blocking access to a website. The ECtHR found that the impugned measure produced arbitrary effects and could not be said to have been aimed solely at blocking access to the offending website, since it consisted in the wholesale blocking of all websites hosted by Google Sites.

    In his concurring opinion in the case, Judge Pinto de Albuquerque reasoned that specific legal provisions are necessary, as general provisions and clauses governing civil and criminal responsibility do not constitute a valid basis for ordering internet blocking. Relying on General Comment 34, the Joint Declaration on Freedom of Expression and the Internet and the 2011 UNSR FreeEx Report, the opinion went further, stating:

    “In any case, blocking access to the Internet, or parts of the Internet, for whole populations or segments of the public can never be justified, including in the interests of justice, public order or national security. Thus, any indiscriminate blocking measure which interferes with lawful content, sites or platforms as a collateral effect of a measure aimed at illegal content or an illegal site or platform fails per se the “adequacy” test, in so far as it lacks a “rational connection”, that is, a plausible instrumental relationship between the interference and the social need pursued. By the same token, blocking orders imposed on sites and platforms which remain valid indefinitely or for long periods are tantamount to inadmissible forms of prior restraint, in other words, to pure censorship.”

    Furthermore, the ECtHR held that the judicial review procedures concerning the blocking of websites in Turkey were insufficient to meet the criteria for avoiding abuse, as Turkish domestic law did not provide for any safeguards to ensure that a blocking order in respect of a specific website was not used as a means of blocking access in general. Accordingly, the ECtHR found that there had been a violation of the right to freedom of expression.

    In a further case in 2021, the Turkish Constitutional Court found that blocking access to news articles on account of a violation of reputation and personal rights unjustifiably infringed the right to freedom of expression and, again, that the domestic law permitting the blocking provided no realistic opportunity to challenge the decision and no procedural safeguards against excessive and arbitrary internet-blocking measures.(10)

    Similar considerations to those relating to litigation in respect of internet shutdowns apply in the context of blocking and filtering. However, there are further practical considerations that might be of use to potential litigators and activists.

    Tips for measuring restrictions

    The Open Observatory of Network Interference (OONI) is a useful, free resource that detects censorship and traffic manipulation on the internet. Its software can help measure:

    • Blocking of websites.
    • Blocking of instant messaging apps (WhatsApp, Facebook Messenger and Telegram).
    • Blocking of censorship circumvention tools (such as Tor).
    • Presence of systems (middleboxes) in your network that might be responsible for censorship and/or surveillance.
    • Speed and performance of your network.

    This tool can be a helpful way to collect data that can be used as evidence of restrictions on access.
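
    For those who want to retrieve such measurements programmatically, the hedged Python sketch below queries OONI’s public measurements API for anomalous results concerning a given site and country. The endpoint, query parameters and response fields are assumptions based on OONI’s published API (https://api.ooni.io) and should be verified against the current documentation; an “anomaly” flag indicates only that a measurement deviated from expectations, not that blocking is confirmed.

        import json
        import urllib.parse
        import urllib.request

        def recent_anomalies(country_code, domain, limit=10):
            # Parameter names follow OONI's published API and may change; verify first.
            params = urllib.parse.urlencode({
                "probe_cc": country_code,  # two-letter country code, e.g. "NG"
                "domain": domain,          # the site under investigation
                "anomaly": "true",         # only measurements flagged as anomalous
                "limit": limit,
            })
            url = f"https://api.ooni.io/api/v1/measurements?{params}"
            with urllib.request.urlopen(url, timeout=30) as resp:
                results = json.load(resp).get("results", [])
            for m in results:
                # Each result is expected to include a timestamp, test name and input URL.
                print(m.get("measurement_start_time"), m.get("test_name"), m.get("input"))

        # Example: anomalous measurements of twitter.com from Nigeria (cf. the 2021 ban).
        recent_anomalies("NG", "twitter.com")

    Raw measurement data retrieved this way still requires expert interpretation before it can support a legal claim.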

    Conclusion

    Activists and litigators should remain vigilant in relation to blocking and filtering and, where necessary, apply the principles of legality, necessity and proportionality to establish when the restriction of content amounts to a rights violation. As international pressure against full-scale internet shutdowns mounts, litigators should be cognisant that blocking and filtering may become increasingly popular measures to restrict the free flow of information.

    Footnotes

    1. ARTICLE 19, ‘Freedom of Expression Unfiltered: How blocking and filtering affect free speech’ (2016) at 7 (accessible at https://www.article19.org/data/files/medialibrary/38586/Blocking_and_filtering_final.pdf).
    2. Internet Society, ‘Internet Society Perspectives on Internet Content Blocking: An Overview’ (2017) (accessible at https://www.internetsociety.org/resources/doc/2017/internet-content-blocking/).
    3. Id. See further Barnes, ‘Technical Considerations for Internet Service Blocking and Filtering’ (2013) (accessible at https://tools.ietf.org/id/draft-iab-filtering-considerations-03.html).
    4. ARTICLE 19 above n 1 at 9.
    5. European Court of Human Rights, Vladimir Kharitonov v. Russia (application no. 10795/14) (2020), para 46 (accessible at https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22001-203177%22]}).
    6. Access Now, ‘Ethiopia: Verifying the unblocking of websites’ (2018) (accessible at https://www.accessnow.org/ethiopia-verifying-the-unblocking-of-websites/).
    7. U.S. Mission to the United Nations, ‘Remarks at a UN Third Committee Dialogue with the Special Rapporteur on the Freedom of Expression’ (2019) (accessible at https://usun.usmission.gov/remarks-at-a-un-third-committee-dialogue-with-the-special-rapporteur-on-the-freedom-of-expression/).
    8. UNICEF, ‘Children’s Rights and Business in a Digital World: Freedom of Expression, Association, Access to Information and Participation’ (2017) at 11 (accessible at https://www.unicef.org/csr/css/UNICEF_CRB_Digital_World_Series_EXPRESSION.pdf).
    9. Id at 12.
    10. Global Freedom of Expression, Columbia University, ‘The Case of Keskin Kalem Yayıncılık v. Ticaret A.Ş.’ (2021) (accessible at https://globalfreedomofexpression.columbia.edu/cases/the-case-of-keskin-kalem-yayincilik-v-ticaret-a-s/).