
    Non-Consensual Sharing of Intimate Images

    Module 2: Digital Attacks and Online Gender-Based Violence


    • Image-based abuse: Non-consensual dissemination of intimate images (NCII) is considered one form of the broader category of image-based sexual abuse, which is, in turn, a form of technology-facilitated gender-based violence (TFGBV) or OGBV. Other forms of image-based abuse include “voyeurism/creepshots, sexploitation, sextortion, the documentation or broadcasting of sexual violence, and non-consensually created synthetic sexual media, including sexual deepfakes.”(1)
    • NCII: NCII “occurs when a person’s sexual images are shared with a wider than intended audience without the subject’s consent.”(2) It is irrelevant whether the person gave initial consent for the creation of the images or consent for them to be shared with other individuals; any dissemination beyond the initially intended audience can be said to constitute NCII. Intimate images can be in the form of either photos or videos and typically depict “nudity, partial nudity or sexually explicit acts.”(3) While NCII can and does affect people of all genders, research indicates that 90% of those victimised are women,(4) although LGBTQ persons and those with disabilities have also fallen victim.(5)
    • Technology enabled: Technological and cultural shifts, epitomised by ubiquitous phones with cameras and a vast digital audience, increase the ease of causing harm and exacerbate the consequences. Motivations behind such actions span a spectrum: from clandestine actors aiming to disrupt individuals’ lives to vengeful ex-partners; from seeking entertainment or validation among peers to profit-driven endeavours; and from cyberbullying tactics aimed at humiliation or control to various other motivations.(6)
    • Evolving terminology: It is notable that NCII has come to replace the outdated term “revenge porn”:
      •  “Revenge” is misplaced: Revenge typically involves harming someone in response to perceived wrongdoing. Labelling it as “revenge” implies that the victim or survivor initiated harm deserving retribution. Additionally, perpetrators are not always motivated by revenge; they may be acting out of spite, or out of a desire for profit, notoriety, or entertainment.
      •  “Pornography” is misplaced: Using the term pornography implies victims or survivors are seemingly consenting porn actors. It further “turns a harmful act into a form of entertainment”.

    Intermediaries and NCII

    Given that NCII is often shared on platforms and websites, considerations around the role of intermediaries come into play, more specifically intermediary liability: the practice of holding internet intermediaries liable for content published on their platforms.

    In sub-Saharan Africa, several countries have enacted laws around intermediary liability, including Ghana,(7) Uganda,(8) and Kenya.(9) In South Africa, for example, Chapter 11 of the Electronic Communications and Transactions Act, 2002 requires members of the Internet Service Providers Association to take down content upon receiving take-down requests.

    Concerns have emerged, however, about the use of take-down procedures to entrench censorship and about the disproportionate power given to private companies to moderate free speech.(10) As online violence often occurs on social media platforms such as Facebook, X, or Instagram, it is important to understand the role of these platforms in protecting users from such harms. While platforms are not required to regulate speech, they are responsible for taking measures to keep their users safe, particularly because their own terms and conditions of use prohibit content that violates users’ trust or safety.

    Litigation in India serves as a useful illustration of intermediary accountability in the context of NCII. In Mrs X v Union of India (2023), the Delhi High Court required intermediaries to remove all NCII of Mrs X (a victim of NCII), not just the links Mrs X had provided. The Court analysed the involvement of intermediaries in removing NCII, noting that while the “originators” who initially publish the content bear responsibility for uploading it, intermediaries are involved in its dissemination and continued presence online. The Court held that Indian legislation mandates intermediaries to exert “reasonable effort” to prevent users from sharing unauthorised or obscene content, and that intermediaries must make use of technology to remove reposts of offending images.(11)

    International law and standards on NCII

    As with online harms in general, several human rights are implicated when it comes to NCII:

    • Freedom of expression: NCII can be and has been used as a tactic to shame and harass women journalists around the world, thereby discouraging critical reporting or shutting down freedom of expression. Even where it is not intentionally shared to shame or stigmatise victims into silence and self-censorship, individuals can and do use nudity, depictions of sex, or eroticism as a “private demonstration of sexuality” or to “express their artistic, journalistic and academic freedoms,”(12) and non-consensual dissemination undermines and punishes this valid expression.
    • Privacy, dignity, and freedom from violence: In 2018 and 2020, the UNSR on VAW observed that the “publication or posting online without the consent of intimate photographs or photoshopped images that are sexualised” violates the subject’s rights to privacy, to dignity, and to live a life free from violence(13) and that this emerging form of online violence “defames and silences women journalists.”(14) NCII also implicates sexual expression. According to the World Health Organisation (WHO), “sexual rights protect all people’s rights to fulfil and express their sexuality and enjoy sexual health.”(15)

    As noted above, and in Module 1, these rights are protected in several instruments and guiding documents in international human rights law. Obligations arise for both states and the private sector:

    • States are required to, among others, create conditions for the effective investigation, prosecution, and protection of attacks against journalists as part of the mandate for protecting and promoting freedom of expression.
    • The United Nations Guiding Principles on Business and Human Rights (UNGPs) place positive responsibilities on private sector actors, including businesses and corporations, such as private social media companies and intermediaries through which many of these abuses flow, to mitigate the human rights impacts of their operations, publish transparency reports, and provide remedies for potential human rights violations.(16)

    At the regional level, while the African Union Convention on Cyber Security and Personal Data Protection (Malabo Convention), which came into effect in 2023, has been faulted for failing to specifically provide for the offence of NCII,(17) its data protection provisions can provide some measure of protection if properly implemented at the domestic level.

    In addition, the African Commission on Human and Peoples’ Rights (ACHPR), in the Declaration of Principles on Freedom of Expression and Access to Information in Africa, affirms that NCII is a punishable offence emanating from the “harmful sharing of personal information.”(18) Although the Declaration is soft law, it provides a persuasive indication of the linkage between the right to informational privacy and this particular manifestation of online violence affecting journalists.

    National laws on NCII

    Numerous states, including in Africa, have passed, or are attempting to pass, domestic civil and criminal laws to provide legal solutions for NCII, either as a form of sexual abuse or harassment or as a privacy violation, albeit with varying degrees of success.

    NCII: Examples of legal protections

    Here is an overview of legal frameworks on NCII in three sub-Saharan African countries:(19)

    • Kenya: The Computer Misuse and Cybercrimes Act (CMCA), 2018 establishes various digital and technology-facilitated offences, including cyber-harassment in section 27 and the “wrongful distribution of obscene or intimate images” in section 37. However, the broad wording of the provision criminalises the sharing of all intimate images, a framing that could have the unintended effect of deterring victims from reporting cases of NCII. Since 2018, this legislation has been the subject of judicial contestation, including an order suspending the operation of sections 27 and 37 in 2018,(20) which was subsequently overturned in 2020.(21) The matter is reportedly being appealed before the Court of Appeal.(22)
    • South Africa: Various pieces of legislation are relevant to NCII. The Cybercrimes Act, 2020, in section 16, criminalises the unlawful and intentional disclosure of a data message of an intimate image of a person without that person’s consent, where the subject retains a reasonable expectation of privacy and the message violates the sexual integrity or dignity of the person or amounts to sexual exploitation; both real and simulated intimate images fall within its scope. In addition, the Film and Publications Amendment Act, 2019 creates the offence of knowingly distributing private sexual photographs and films in any medium without consent and with the intent to cause the subject harm (section 24E). The Protection of Personal Information Act, 2013 (POPIA) may also provide some protection in the form of relief for damages against a perpetrator for a data protection violation. Lastly, the Protection from Harassment Act, 2011 enables victims and survivors to apply for protection orders, and the common law crime of crimen iniuria can be used in cases involving the wilful impairment of a person’s dignity and privacy. Commentators have also expressed concern about potential loopholes in the relevant legislation, particularly around intent to do harm and the definition of private images.(23)
    • Malawi: In Malawi, although no specific legislation exists, a patchwork of laws may provide some limited protection for victims and survivors. For example, the Electronic Transactions and Cybersecurity Act, 2016 criminalises cyber-harassment (section 86), offensive communication (section 87), and cyber-stalking (section 88). However, the broadness of these provisions may also have negative consequences for freedom of expression online, and implementation of the law has proven challenging, with many women facing difficulties in reporting these crimes to the police.(24) Notably, section 30 also sets out the responsibilities of intermediary service providers to take down content that is unlawful or violates rights.(25) Section 137 of the Malawi Penal Code, 1930 also criminalises “insulting the modesty of a woman,” and the Gender Equality Act, 2016 prohibits “harmful practices… on account of sex [or] gender,” although these vague provisions may also have negative side-effects.(26)

    Many of these laws raise challenges for ensuring accountability for victims and survivors:

    • Laws dealing with NCII usually prioritise intent when determining whether a human rights violation or civil or criminal offence has occurred, which can be a steep evidentiary burden for victims and survivors.(27)
    • Sometimes, perpetrators may act without aiming to hurt the subject.(28)
    • Many do not address threats to release a certain image or video but only the actual release itself.(29)
    • Developing appropriate legal responses to address NCII is further complicated by the fact that recent technological advancements have “opened the door to new forms of abuse” which include the use of artificial intelligence to create images at scale and which creates challenges for tracing origin and removal.(30)
    • Further, even where legal recourse can be achieved against the primary distributor, a long chain of others who redistribute, view, or engage with these images may be created which makes permanent removal and full accountability exponentially difficult.(31)

    An alternative argument is that intimate images are protected under the moral rights of copyright, which allow individuals to:

    • claim authorship of a photo or video, and
    • enforce the right to prohibit or authorise the distribution of a photo or video.

    This argument draws on the Berne Convention for the Protection of Literary and Artistic Works and Article 27 of the UDHR, which protects “the moral and material interests resulting from any scientific, literary or artistic production of which he is the author.”(32) However, in using such a copyright approach, which may be the only viable option for some social media platforms, victims or survivors have sometimes been required to prove that they hold copyright over the images prior to removal by intermediaries.(33)

    Global approaches to NCII

    Cases around the world have demonstrated the various approaches to seeking accountability for incidents of NCII. For example, in the case of Holly Jacobs v Ryan Seay & Others (2014) in the Circuit Court of the Eleventh Judicial Circuit in Florida, United States, a woman brought a claim for the intentional infliction of emotional distress, which required demonstrating a lack of consent and an intention by the abuser to cause emotional distress.

    In Khadija Ismayilova v Azerbaijan (2019), the European Court of Human Rights (ECtHR) held that Azerbaijan had violated the right to privacy and the freedom of expression of a journalist in a matter involving the online dissemination of intimate videos recorded covertly in her bedroom. The Court held that the failure by the state to properly investigate the crimes constituted a failure in its positive obligations to protect her journalistic freedom of expression and her private life.

    These cases illustrate that different legal routes are available in NCII claims and that different rights are implicated.

    Others have relied on breach of confidentiality, a well-established legal concept, by demonstrating that an express or implied duty of confidence was breached. An implied breach would focus on whether trust has been breached, rather than on the “private or offensive” nature of the distributed information.(34)

    Case note: Litigating Non-Consensual Distribution of Images

    In 2016, the High Court of Kenya determined a case, Roshanara Ebrahim v Ashleys Kenya Limited & 3 others (2016), involving the non-consensual distribution of the petitioner’s nude photographs by an ex-boyfriend, which resulted in her dethronement as Miss World Kenya 2015. The Court held that Ebrahim had a legitimate expectation of privacy, that she did not waive her right to privacy by taking nude photographs, and that she did not consent to their dissemination to third parties; as such, her right to privacy under Article 31 of the Constitution of Kenya had been violated. It further ordered the ex-boyfriend to pay damages and directed the organisers of Miss World Kenya not to publish the nude photographs in their possession.

    The case provides valuable insights into the ‘reasonable expectation of privacy,’ whether images are obtained in an intrusive manner, and whether the presence of illegalities may invalidate a right to privacy claim.(35)

    Finally, in states where NCII is not criminalised, the options are limited to other crimes, such as stalking, harassment, unlawful surveillance, or the dissemination of child pornography.


    1. Suzie Dunn, ‘Technology-Facilitated Gender-Based Violence: An Overview’ (2020) at 8.
    2. Suzie Dunn and Alessia Petricone-Westwood, ‘More than ‘revenge porn’: Civil remedies for the non‑consensual distribution of intimate images’ (2018).
    3. CIGI, ‘Non-Consensual Intimate Image Distribution: The Legal Landscape in Kenya, Chile and South Africa’ (2021).
    4. Cyber Rights Organisation, ‘NCII: 90% of victims of the distribution of non-consensual intimate imagery are women.’
    5. CIGI, above n 3.
    6. Id.
    7. Section 92 of Ghana’s Electronic Transactions Act of 2008.
    8. Section 29 of Uganda’s Electronic Transactions Act of 2011.
    9. Section 35B of Kenya’s Copyright Act, CAP 130.
    10. Godana Galma, ‘Digital Rights Implication of the Copyright (Amendment) Act 2019’ (2020).
    11. See Global Expression, ‘Mrs X v Union of India (2023)’ for more details.
    12. ARTICLE 19, ‘Kenya: Withdraw proposed amendments to cybercrimes law’ (2021).
    13. UNSR on VAW Report on online violence, above n 5.
    14. UNHRC, ‘Combating violence against women journalists: Report of the Special Rapporteur on violence against women, its causes and consequences’ (2020).
    15. WHO, ‘Developing sexual health programmes: a framework for action’ (2010).
    16. UN Guiding Principles on Business and Human Rights.
    17. CIGI, above n 3.
    18. Principle 42, Declaration of Principles on Freedom of Expression and Access to Information in Africa (2019).
    19. Sarai Chisala-Tempelhoff and Monica Twesiime Kirya, ‘Gender, law and revenge porn in Sub-Saharan Africa: a review of Malawi and Uganda’ (2016); CIGI, above n 3.
    20. CIPESA, ‘Promoting Best Practice among Activists for More Effective Collaboration in Digital Rights Litigation in Kenya’ (2019).
    21. Digital Space Case Digest, ‘Civic Space Protection Platform.’
    22. Id.
    23. Schindlers, ‘South Africa Cracks Down on Revenge Porn’ (2020).
    24. African Feminism, ‘Accessing Justice for Image-Based Sexual Abuse: A Challenge for Victims in Malawi’ (2020).
    25. Seonaid Stevenson-McCabe and Sarai Chisala-Tempelhoff, ‘Image-Based Sexual Abuse: A Comparative Analysis of Criminal Law Approaches in Scotland and Malawi’ (2021).
    26. Id.
    27. Foreign Policy, ‘The World Hasn’t Figured Out How to Stop ‘Revenge Porn’’ (2021).
    28. CCRI.
    29. UNHRC, ‘Right to Privacy: Report of the Special Rapporteur on the right to privacy’ (2019) at para 71.
    30. Suzie Dunn, ‘Identity Manipulation: Responding to Advances in Artificial Intelligence and Robotics’ (2020); Suzie Dunn, ‘Technology-Facilitated Gender-Based Violence: An Overview’ (2020).
    31. Clare McGlynn and Erika Rackley, ‘Image-Based Sexual Abuse’ (2017).
    32. Article 27, Universal Declaration of Human Rights.
    33. Foreign Policy, ‘The World Hasn’t Figured Out How to Stop ‘Revenge Porn’’ (2021).
    34. Woodrow Hartzog, ‘Reviving Implied Confidentiality’ (2013).
    35. For further information on the use of the ‘tort of invasion of privacy’, the public disclosure of embarrassing facts, breach of confidence, and the intentional infliction of mental distress, see Jane Doe 464533 v D. (N.); see also Equality Project, ‘Technologically-Facilitated Violence: Non-Consensual Distribution of Intimate Images Case Law’ (January 2019).