
    The Right to Privacy

    Module 1: General Overview of Trends in Digital Rights Globally

    In the last decade, there have been considerable developments relating to the exercise of the right to privacy online.

    Data Privacy

    The General Data Protection Regulation (GDPR) entered into force in Europe in 2016 and became fully applicable in May 2018, with widespread consequences for regions around the world as well. It exposed the increasing need to protect the right to privacy in a rapidly changing technological landscape and prompted rapid legislative change in many countries seeking to maintain trade and data flows with the European region.

    Comprehensive data protection laws are vital for securing human rights in the digital age, and the GDPR introduced, for the first time, a number of safeguards necessary for the advancement of those rights.(1) In particular, it protects people against gratuitous and excessive data collection. In the years since, the sum of fines issued under the GDPR has skyrocketed, reaching a total of over EUR 4 billion by the end of 2023.(2) In 2023, the European Data Protection Board reported that the GDPR has strengthened, modernised, and harmonised data protection principles across the EU.(3) Despite this, there have been notable challenges in implementing the GDPR, enforcing fines and rulings made under it, and ensuring compliance across the continent, and important interpretative decisions continue to be handed down.(4)

    Another flagship data protection law, the California Consumer Privacy Act (CCPA), also came into effect in January 2020, seeking to address how private companies are allowed to collect and use the data of California residents. The CCPA allows residents of California to know:

    • What personal information a data company has collected about them.
    • What personal information third parties have obtained about them.
    • The specific personal information a company has compiled about them.
    • Specific inferences that have been made about them based on their personal information.(5)

    The CCPA undoubtedly increases data privacy protections and sends a strong message that “[i]n a GDPR + CCPA world, negligence of data privacy protections will not be tolerated and will result in higher fines.”(6)

    The GDPR and CCPA set off a wave of other countries passing revised or new data privacy laws aimed at protecting people’s data in the modern age. The UN Conference on Trade and Development (UNCTAD) has found that, of the 194 countries it reviewed:

    • 71% of countries have data protection legislation.
    • 9% of the states have draft legislation.
    • 15% of countries have no legislation.
    • 5% of countries have no data available.

    Mapping the state of data protection in Africa

    Data protection legislation is crucial to protecting the right to privacy in the digital age. The progression of legislation and regulation in this area has been rapid in Africa in recent years.

    dataprotection.africa is an open, online resource that aims to provide a detailed analysis of the governance of data protection across the continent, mapping and analysing the legislation in place in all 55 member states of the African Union.

    At present, 36 African countries have passed data protection laws, with a further three considering draft legislation. Most recently, Tanzania, Uganda and Eswatini passed new data protection laws in 2022, and Nigeria and Somalia in 2023. Kenya also passed new regulations under its data protection law in 2021 in an effort to strengthen the existing law.

    Also of significance was the coming into force of the long-awaited African Union Convention on Cyber Security and Personal Data Protection (Malabo Convention) in 2023. The Convention aims to create a comprehensive legal framework for electronic commerce, data protection, and cybercrime and cybersecurity on the continent, and requires AU member states that ratify it to have domestic laws in each of these policy areas which conform to the standards and principles outlined in the Convention.

    While many countries have data protection frameworks in place, implementation lags significantly, with many countries failing to establish or appoint data protection authorities to enforce these laws.(7) Enforcement has also proved challenging, with many data protection authorities on the continent struggling with a lack of independence and resources.(8)

    Cross-border transactions and multinational corporations that operate across multiple jurisdictions require data protection regulation, demonstrating the importance of data protection to enabling trade. African states are increasingly recognising the need to enact data protection laws, and the focus should now shift towards ensuring that the content of these laws meaningfully enables fundamental rights and that the laws are implemented and enforced. For example, many laws contain exemptions, such as for public bodies or law enforcement agencies, that limit their scope and effectiveness,(9) and many may need to be updated to account for the new data protection challenges posed by artificial intelligence (AI).

    More Resources on Data Protection

    Surveillance

    Mass and targeted surveillance practices are on the rise, and there is a notable absence of international legal frameworks and strict safeguards in place. State-led surveillance is frequently implemented without an underlying legal framework and in ways that lack transparency and accountability, a genuine affront to the right to privacy.

    United Kingdom

    The European Court of Human Rights (ECtHR) has addressed the British government’s powers to engage in surveillance, holding that the country’s bulk surveillance programme violated the rights to privacy and freedom of expression under the European Convention on Human Rights due to a lack of independent oversight, an overly broad application of surveillance, and a failure to sufficiently protect journalists’ confidential communications.(10) A new Bill introduced in 2023 to amend the country’s laws faced public backlash, with commentators arguing that it could threaten technological innovation.(11) The UK has also recently introduced the Online Safety Bill, about which some have expressed concerns regarding clauses that could mandate mass surveillance of private digital communications.(12)

    South Africa

    In South Africa, the Constitutional Court in 2021 declared various provisions of the domestic surveillance law unconstitutional following a complaint brought by an investigative journalist whose communications had been monitored by intelligence officials; the Court ordered a range of amendments to improve transparency, safeguards, and oversight mechanisms for state surveillance operations.(13) In 2023, South Africa introduced a Bill to amend the unconstitutional law, which has generated much public outcry, with commentators arguing that the Bill falls short of what the judgment demanded and fails to address other long-standing issues.(14) Despite this, the Bill has been passed by Parliament and is awaiting signature by the President.(15)

    As ever-more sophisticated technologies are developed, such as biometric surveillance, facial recognition technology, and data analysis using artificial intelligence, the issue of surveillance is only going to grow as a concern for digital rights. Although effective litigation and advocacy can result in important protections and safeguards, it is clear that there is still much to be done by states to put in place more robust legal frameworks and strict safeguards relating to surveillance in the future to avoid such challenges and to protect privacy rights.

    Surveillance and press freedom

    In recent years, the use of sophisticated surveillance technology on mobile phones has gained increasing prominence amid concerns about its extensive abuse to monitor political opponents and activists. In 2021, news broke that at least 180 journalists across 21 countries had been targeted for surveillance by the Pegasus spyware, a system that can be remotely installed on a smartphone, enabling complete control over the device.(16) The prevalence and seemingly unrestricted use of such technologies is deeply concerning for the right to freedom of expression, particularly considering their use in contexts in which the safety of journalists is already seriously at risk.

    The Supreme Court of India in 2021 ordered an independent inquiry into allegations that the government deployed the Pegasus spyware against various journalists, politicians and dissidents, finding that the free press’s democratic function was at stake and that “such chilling effect on the freedom of speech is an assault on the vital public watchdog role of the press, which may undermine the ability of the press to provide accurate and reliable information.”(17)

    Africa has unfortunately not been immune to these trends. African activists and journalists were among the targets identified in the Pegasus scandal, and powerful politicians and state officials on the continent were revealed to be users of the tools. In 2024, Reporters without Borders found spyware traces on the phones of two Togolese journalists while they were on trial for defamation against a government minister.(18)

    Video surveillance and closed-circuit television (CCTV) are also widely used across the world, increasingly in combination with facial recognition technology (FRT). State and non-state actors frequently invoke security threats to justify the widespread use of video surveillance and FRT. This form of surveillance and monitoring is susceptible to an array of abuses.

    The American Civil Liberties Union has identified the following risks of abuse:

    • Institutional abuse.
    • Abuse for personal gain.
    • Discretionary targeting.
    • Voyeurism.
    • Location monitoring.

    Such surveillance is often unregulated or under-regulated and can have a chilling effect on public life, in addition to creating risks of being abused to monitor critics or activists, target marginalised groups, and collect excessive data, often without consent. The growing quality and sophistication of video surveillance also raise concerns that, for example, data from video surveillance systems can be combined with other forms of private and public information to create incredibly detailed profiles of people. Conversely, while such surveillance systems are often invasive, the potential inaccuracy and fallibility of the technology is also a concern, with a growing body of evidence that FRT systematically misidentifies certain populations and is vulnerable to discrimination and bias.(19)
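
    To make the matching step concrete, the minimal sketch below (in Python, under purely illustrative assumptions) shows how a typical FRT system declares a “match”: a face image is reduced by a trained model to a numeric embedding, and two embeddings are compared against a similarity threshold. The embeddings, their dimensionality, and the threshold here are hypothetical placeholders rather than values from any real system; the point is that the choice of threshold trades false matches against missed matches, and both error rates can differ across demographic groups, which is one source of the bias concerns noted above.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; values closer to 1.0 mean more alike."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 128-dimensional embeddings that a face-encoder model might produce.
# Real systems derive these from images; random vectors are used here purely for illustration.
rng = np.random.default_rng(0)
probe_embedding = rng.random(128)      # face captured by a surveillance camera
enrolled_embedding = rng.random(128)   # face stored in a watchlist or database

MATCH_THRESHOLD = 0.8  # illustrative operating point, not a calibrated real-world value

score = cosine_similarity(probe_embedding, enrolled_embedding)
if score >= MATCH_THRESHOLD:
    print(f"Match declared (score={score:.2f})")
else:
    print(f"No match (score={score:.2f})")
```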

    A 2021 report by Thales (a large producer of surveillance technologies) identified the top seven trends in facial recognition, including the following:(20)

    • Facial recognition technologies are increasingly used to identify and verify a person using their facial features, by capturing, analysing, and comparing patterns based on the person’s facial details.
    • Facial recognition technologies are predominantly used for security and law enforcement, health, and marketing and retail.

    Despite calls for moratoriums on certain uses of these kinds of technology, it is clear that facial recognition technology is here to stay, with expected global industry growth of USD 5.71 billion in 2024.(21) It is also increasingly being used for surveillance, including across Africa. Fortunately, a wave of activism has recently begun to raise awareness about the potential rights implications of these technologies, with some notable successes in both litigation and policy change.

    Legal developments in FRT

    In 2023, the United States introduced the Facial Recognition and Biometric Technology Bill which, if passed, would ban the use of facial recognition by the federal government unless explicitly approved by an Act of Congress. This Bill is a significant step in a nationwide movement to ban government use of face surveillance technology.(22) In calling for such bans, activists frequently cite the discriminatory effects of such technology and its potential risks to privacy, freedom of expression, information security, and social justice.

    The European Union’s GDPR also classifies biometric data as a special category of personal data and prohibits its processing unless one of the lawful grounds for processing applies.(23) In Sweden, a school was fined for taking attendance through FRT, as this processing did not fall within one of the lawful grounds.(24)

    In addition, the EU’s AI Act, first proposed in 2021, “aims to limit the use of biometric identification systems including facial recognition that could lead to ubiquitous surveillance” by introducing new rules that hinge on whether a given use is defined as “high-risk” or “low-risk”.(25) However, the proposed provisions have been watered down in subsequent negotiations,(26) as the Act continues to progress through the final stages of approval.(27)

    In Brazil, a civil court in São Paulo held that the use of facial recognition technology on a subway line infringed the right to privacy and freedom of expression due to the lack of consent from users, and the subway operator was ordered to stop using the technology.(28)

    The collection of biometric data

    Biometric data collection entails the identification and authentication of a person based on unique biological characteristics. FRT is one form of biometric technology, and one that is particularly widely used for surveillance purposes. According to a 2023 review of biometrics by Thales, biometric technologies are most frequently used for the following:(29)

    • Law enforcement and public security: identifying criminals, suspects and victims.
    • Military: identifying enemies and allies.
    • Border, travel, and migration control: identifying travellers, passengers, and nationality.
    • Civil identification: identifying citizens, residents and voters.
    • Healthcare and subsidies: identifying patients, beneficiaries, and healthcare professionals.
    • Physical and logical access: identifying owners, users, employees and contractors.
    • Commercial applications: identifying consumers and customers.
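
    Most of the use cases above rely on one of two operations: verification (a 1:1 check, “is this person who they claim to be?”) or identification (a 1:N search, “who is this person?”). The distinction matters for privacy because identification requires a searchable database of biometric templates for an entire enrolled population. The sketch below is a minimal illustration in Python under invented assumptions: the template vectors, gallery names, and threshold are placeholders and do not reflect any real deployment.

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two biometric templates (e.g. face or fingerprint embeddings)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, claimed_template: np.ndarray, threshold: float) -> bool:
    """1:1 verification: does the probe match the single template of the claimed identity?"""
    return similarity(probe, claimed_template) >= threshold

def identify(probe: np.ndarray, gallery: dict, threshold: float):
    """1:N identification: search an entire enrolled gallery for the best-scoring identity."""
    best_name, best_score = max(
        ((name, similarity(probe, template)) for name, template in gallery.items()),
        key=lambda item: item[1],
    )
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical enrolled population: random vectors stand in for real biometric templates.
rng = np.random.default_rng(1)
gallery = {f"person_{i}": rng.random(64) for i in range(1000)}
probe = rng.random(64)

print(verify(probe, gallery["person_0"], threshold=0.8))   # 1:1 check against one identity
print(identify(probe, gallery, threshold=0.8))             # 1:N search across all identities
```

    Note that identification of this kind only works if the state or vendor holds templates for everyone enrolled, which is precisely the kind of centralised, sensitive database whose security and storage challenges are discussed below.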

    The use of biometric technology is proliferating at a rapid rate, causing significant concern with regard to human rights. States are often ill-equipped to deal with the security and data storage challenges that come with collecting and storing such sensitive personal information, and examples of biometrics being used either for nefarious purposes or to the exclusion of already-marginalised populations abound. There are also growing concerns that the frequent use of biometric technologies has become unduly intrusive, contributing to the burgeoning network of surveillance technologies. Civil liberties organisation Liberty has noted that:(30)

    “Use of big data and new technologies is often viewed as a panacea for the challenges that modern-day law enforcement faces. Technologies such as mobile fingerprint scanners, facial recognition and mobile phone data extraction, used in conjunction with one another and police super-databases, risk changing the relationship between the individual and the state, creating a society in which anonymity is the exception, and pervasive surveillance is the norm.”

    As with most technologies, the positive potential is significant, but the potential for rights violations is often ignored or underestimated. Some advocates argue that biometrics can be particularly useful in electoral settings by potentially:(31)

    • Improving voter registration and identification;
    • Producing a credible electoral register; and
    • Reducing electoral fraud.

    Biometrics and elections in Africa

    Public opinion on the use of biometrics in elections across Africa varies. Biometric technologies have been heralded as having huge potential to curb electoral fraud and ensure that each person can only vote once. However, there are also concerns about implementation challenges, including technological failures and privacy risks.(32) High costs, limited data literacy, and ineffective data protection regimes may pose serious risks to privacy. There have also been examples of high levels of exclusion of certain populations, and of abuse by governments embracing the trend of rising digital authoritarianism.

    Despite mixed views, the use of biometrics for voting continues to rise. For example, Cameroon’s 2024 biometric voter registration drive aims to enrol 7.5 million voters ahead of the 2025 general elections.(33) Other African countries, such as Ghana, Nigeria and Uganda, are similarly implementing or considering biometric systems for elections. CIPESA has documented the deployment of other national biometric technology-based programmes in 16 African countries in recent years.(34)

    Anonymity and encryption

    Encryption and anonymity are meant to “provide individuals and groups with a zone of privacy online to hold opinions and exercise freedom of expression without arbitrary and unlawful interference or attacks.”(35) In 2018, the United Nations Special Rapporteur on Freedom of Expression (UNSR on FreeEx) observed that the challenges users face in using digital security tools have increased, exacerbated by the tension between state interests and personal privacy:(36)

    “the challenges users face [in using these tools] have increased substantially, while States often see personal, digital security as antithetical to law enforcement, intelligence, and even goals of social or political control. As a result, competing trends and interests have led, on the one hand, to a surge in State restrictions on encryption and, on the other hand, increased attention to digital security by key sectors of the private Information and Communications Technology (“ICT”) sector.”

    As society’s reliance on digital technologies has increased, users have become increasingly aware of the value of encryption as a tool to protect private communications in the digital era. This is particularly true for users such as journalists, activists, and lawyers, for whom the protection of communications is not merely a personal but also a professional imperative. In parallel with the rise in digital surveillance and cybercrimes discussed above, encryption has become a protective tool for the average internet user rather than something specialised, technical, and out of reach, as it was a few years ago. The United Nations Special Rapporteur on Freedom of Expression has highlighted that “encryption and anonymity enable individuals to exercise their rights to freedom of opinion and expression in the digital age and, as such, deserve strong protection.”(37)
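
    As a concrete, deliberately simplified illustration of what encryption provides in practice, the sketch below uses PyNaCl (a Python binding to the libsodium library) to perform public-key authenticated encryption between two parties: only the intended recipient’s private key can decrypt the message, so anyone intercepting the ciphertext in transit learns nothing about its content. This is a minimal sketch, not a description of how any particular messaging app implements end-to-end encryption; real deployments add key verification, forward secrecy, and other protections.

```python
# Requires the PyNaCl package: pip install pynacl
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public keys are ever shared.
journalist_key = PrivateKey.generate()
source_key = PrivateKey.generate()

# The source encrypts a message that only the journalist can read.
sending_box = Box(source_key, journalist_key.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place, 18:00")

# An interceptor sees only `ciphertext`, which is indistinguishable from random bytes.
# The journalist decrypts it with their own private key and the source's public key.
receiving_box = Box(journalist_key, source_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)

assert plaintext == b"meet at the usual place, 18:00"
print(plaintext.decode())
```

    Decryption mandates of the kind described below can generally only be satisfied if providers weaken this design, for example by retaining copies of users’ private keys or by inspecting content before it is encrypted.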

    Simultaneously, the rise of social media as a powerful platform for communication has enabled greater anonymity. States, particularly law enforcement agencies, have begun to push back against this growing use of encryption and anonymity, ostensibly in the interest of safety and security. Many countries in Africa have relatively heavy restrictions on encryption, although the heaviest restrictions globally are found in Russia and parts of Asia.(38)

    Anonymity on social media

    While threats to encryption are frequently seen as mere fronts for authoritarian attempts to control the flow of information, or as disproportionate efforts to crack down on crime, online anonymity has also prompted contested debate about how to ensure accountability for online harms while protecting freedom of expression in digital spaces. For example, social media users in LGBTQIA communities have cited the importance of online anonymity in facilitating safe discussions about sexuality in environments where such discussions might put them at risk.(39)

    CIPESA has reported that state agencies in several African countries can request the decryption of data held by service providers, potentially undermining the very essence of encryption services. For example:(40)

    • Nigeria’s Social Media Bill was introduced in 2019 and, as of 2024, has not yet been passed. However, concerns around the bill’s impact on encryption and anonymity resurfaced after the Nigerian President announced in 2023 that the bill had been submitted to parliament.(41) The bill would allow the government to examine internet traffic to determine its content by, for example, restricting the use of end-to-end encryption or requiring content to be decrypted.(42)
    • In Zimbabwe, the Interception of Communications Act mandates cryptography services to decrypt data at judicial authorities’ request, with non-compliance punishable by fines or imprisonment.(43)

    As challenges to privacy rise, so too will the need to secure anonymity and promote the use of encryption technologies, particularly for journalists, lawyers, activists, and others at risk of oppression. These technologies will continue to develop and become more sophisticated, but as they do, the threat of increased state intrusion into the private lives of citizens, and of attempts to weaponise and abuse such technologies, is also likely to grow.

    More Resources on Surveillance & Encryption

    Artificial intelligence

    The growing prevalence and use of artificial intelligence (AI), particularly publicly available generative AI tools such as ChatGPT and Microsoft Bing, are raising new questions about the widespread collection of personal information, both to train these systems and to generate responses to user prompts, and about the resulting risks to data protection and privacy. Many of these tools have reportedly been trained on the entirety of publicly available information on the internet, which would include personal information shared on social media and other sites, without users having given consent for this use.

    Efforts to regulate AI are also, as a result, increasing, most notably through the EU’s AI Act, which seeks to “regulate artificial intelligence (AI) to ensure better conditions for the development and use of this innovative technology.”(44) It also includes limitations on the use of biometric identification systems by law enforcement and seeks to enable data subjects to receive meaningful explanations about the use of these systems.(45) The European Parliament and the Council reached a provisional agreement on the Act in December 2023, setting the scene for it to be passed into law in the coming months.

    Footnotes

    1. Human Rights Watch, ‘The EU General Data Protection Regulation’ (2018) (accessible at https://www.hrw.org/news/2018/06/06/eu-general-data-protection-regulation).
    2. EQS, ‘The Biggest GDPR Fines of 2023’ (2024) (accessible at https://www.eqs.com/compliance-blog/biggest-gdpr-fines/).
    3. European Data Protection Board, ‘Contribution of the EDPB to the report on the application of the GDPR under Article 97’ (2023) (accessible at https://edpb.europa.eu/system/files/2023-12/edpb_contributiongdprevaluation_20231212_en.pdf).
    4. CMS, ‘GDPR Enforcement Tracker Report’ (2023) (accessible at https://cms.law/en/deu/publication/gdpr-enforcement-tracker-report/executive-summary).
    5. New York Times, ‘How California’s New Privacy Law Affects You’ (2020) (accessible at https://www.nytimes.com/2020/01/03/us/ccpa-california-privacy-law.html).
    6. PwC, ‘Top Policy Trends 2020: Data privacy’ (2020) (accessible at https://www.pwc.com/us/en/library/risk-regulatory/strategic-policy/top-policy-trends/data-privacy.html).
    7. Accessible at www.dataprotection.africa.
    8. Access Now, ‘Strengthening data protection in Africa: Key issues for implementation’ (2024) (accessible at https://www.accessnow.org/wp-content/uploads/2024/01/Strengthening-data-protection-in-Africa-key-issues-for-implementation-updated.pdf).
    9. Id.
    10. Big Brother Watch v. The United Kingdom (Big Brother I) App nos. 58170/13, 62322/14 and 24960/15 (2018) (accessible at https://globalfreedomofexpression.columbia.edu/cases/big-brother-watch-v-united-kingdom/).
    11. techUK, ‘Expressing techUK members’ concerns regarding the Investigatory Powers (Amendment) Bill’ (2023) (accessible here).
    12. Just Security, ‘Changes to UK Surveillance Regime May Violate International Law’ (2023) (accessible here).
    13. AmaBhungane Centre for Investigative Journalism NPC and Another v Minister of Justice and Correctional Services and Others ZACC 3 (2021) (accessible at http://www.saflii.org/za/cases/ZACC/2021/3.html).
    14. Intelwatch, ‘Submission: What’s wrong with the RICA bill’ (2023) (accessible here).
    15. Parliamentary Monitoring Group, ‘Regulation of Interception of Communications and Provision of Communication-related Information Amendment Bill’ (2023) (accessible here).
    16. Forbidden Stories, ‘Journalists Under Surveillance’ (2021) (accessible at https://forbiddenstories.org/pegasus-journalists-under-surveillance/).
    17. RSF, ‘In first for Togo, RSF identifies spyware on phones of two Togolese journalists’ (2024) (accessible at https://rsf.org/en/first-togo-rsf-identifies-spyware-phones-two-togolese-journalists).
    18. European Digital Rights Initiative, ‘Facial recognition and fundamental rights 101’ (2019) (accessible at https://edri.org/our-work/facial-recognition-and-fundamental-rights-101/).
    19. Thales, ‘Facial recognition: top 7 trends (tech, vendors, use cases)’ (2021) (accessible at https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/biometrics/facial-recognition).
    20. ACLU, ‘The Fight to Stop Face Recognition Technology’ (2023) (accessible at https://www.aclu.org/news/topic/stopping-face-recognition-surveillance).
    21. European Union, ‘General Data Protection Regulation’ (2018) (accessible at https://gdpr-info.eu/).
    22. European Parliament, ‘Regulating facial recognition in the EU’ (2021) (accessible at https://www.europarl.europa.eu/RegData/etudes/IDAN/2021/698021/EPRS_IDA(2021)698021_EN.pdf).
    23. IAPP, ‘EU countries vote unanimously to approve AI Act’ (2024) (accessible at https://iapp.org/news/a/eu-countries-vote-unanimously-to-approve-ai-act/).
    24. The Case of São Paulo Subway Facial Recognition Cameras (2021) (accessible at https://globalfreedomofexpression.columbia.edu/cases/the-case-of-sao-paulo-subway-facial-recognition-cameras/).
    25. Thales, ‘Biometrics: definition, use cases, latest news’ (2023) (accessible at https://www.thalesgroup.com/en/markets/digital-identity-and-security/government/inspired/biometrics).
    26. Liberty, ‘Rights Groups Urge Shops To Reject Facial Recognition’ (accessible at https://www.libertyhumanrights.org.uk/news/blog/papers-please-how-biometric-id-checks-put-our-rights-risk).
    27. Duduetsang Mokoele and Nomaqhawe Moyo, ‘Biometric voting has many pitfalls, but it could work in South Africa’ (2019) Daily Maverick (accessible at https://www.dailymaverick.co.za/opinionista/2019-05-08-biometric-voting-has-many-pitfalls-but-it-could-work-in-south-africa/).
    28. Aratek, ‘How Biometrics Is Becoming a Norm of Elections in Africa’ (2022) (accessible at https://www.aratek.co/news/how-biometrics-is-becoming-a-norm-of-elections-in-africa).
    29. Biometric Update, ‘Cameroon launches last full-cycle biometric voter registration before 2025 polls’ (2024) (accessible at https://www.biometricupdate.com/202401/cameroon-launches-last-full-cycle-biometric-voter-registration-before-2025-polls).
    30. CIPESA, ‘State of Internet Freedom in Africa 2022: The Rise of Biometric Surveillance’ (2022) (accessible at https://cipesa.org/2022/09/state-of-internet-freedom-in-africa-2022-the-rise-of-biometric-surveillance/).
    31. UNHRC, ‘UNSR on FreeEx: Report on encryption, anonymity, and the human rights framework’ (2015) (accessible at https://www.ohchr.org/en/calls-for-input/report-encryption-anonymity-and-human-rights-framework).
    32. UNSR on FreeEx, ‘Encryption and Anonymity follow-up report’ (2018) (accessible at https://www.ohchr.org/sites/default/files/Documents/Issues/Opinion/EncryptionAnonymityFollowUpReport.pdf).
    33. Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (2015) (accessible at https://documents-dds-ny.un.org/doc/UNDOC/GEN/G15/095/85/PDF/G1509585.pdf?OpenElement).
    34. Comparitech, ‘Encryption laws: Which governments place the heaviest restrictions on encryption?’ (2022) (accessible at https://www.comparitech.com/blog/vpn-privacy/encryption-laws/).
    35. The Conversation, ‘Online abuse: banning anonymous social media accounts is not the answer’ (2021) (accessible at https://theconversation.com/online-abuse-banning-anonymous-social-media-accounts-is-not-the-answer-170224).
    36. CIPESA, ‘Policy Brief: How African States Are Undermining the Use of Encryption’ (2021) (accessible at https://cipesa.org/2021/10/policy-brief-how-african-states-are-undermining-the-use-of-encryption/).
    37. Africa News, ‘Nigeria proposes new social media regulations’ (2023) (accessible at https://www.africanews.com/2023/10/11/nigeria-proposes-new-social-media-regulations/).
    38. Internet Society, ‘Internet Impact Brief’ (2022) (accessible at https://www.internetsociety.org/wp-content/uploads/2022/02/IIB-Nigeria-Social-Media-Bill.pdf).
    39. CIPESA, ‘How African Governments Undermine the Use of Encryption’ (2021) (accessible at https://cipesa.org/wp-content/files/briefs/How-African-Governments-Undermine-the-Use-of-Encryption-Oct-26.pdf).
    40. European Parliament, ‘EU AI Act: first regulation on artificial intelligence’ (2023) (accessible at https://www.europarl.europa.eu/news/en/headlines/society/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence).
    41. European Parliament, ‘Artificial Intelligence Act: deal on comprehensive rules for trustworthy AI’ (2023) (accessible at https://www.europarl.europa.eu/news/en/press-room/20231206IPR15699/artificial-intelligence-act-deal-on-comprehensive-rules-for-trustworthy-ai).