The Need to Address Image-Based Sexual Abuse in a Digital Sri Lanka

Photo courtesy of UNFPA

Towards the end of 2025, Sri Lanka received $50 million from the World Bank to accelerate its digital transformation. Close to 60% of the population currently uses the internet with around nine million social media identities recorded in October 2025. Anticipating a future of increased digitalisation, internet connectivity, social media use and the use of AI tools, it is important to consider how harmful gendered social dynamics are finding a home online. What happens when rapid digital transformation and uptake take hold in a context of weak policy coordination, low digital literacy and poor platform accountability? What do these trends mean for sexual and gender-based violence and response?

Technology-Facilitated Gender-Based Violence (TFGBV) takes many forms, including cyberbullying, online harassment, hate speech, doxing, intimate image abuse, trolling and deepfakes. Image-based sexual abuse is one such form and refers to the non-consensual sharing of nude images online, which has become a trend across Facebook pages, WhatsApp groups, Telegram channels and certain websites. Such violence disproportionately affects young women, girls, queer-identifying people, women with disabilities and racialised, minoritised and migrant groups of women, reflecting hierarchies and harmful norms already prevalent offline in society.

How is image-based sexual abuse being dealt with?

The Grassrooted Trust, established in 2013, began receiving a growing number of reports from students about blackmail in 2015, mainly over the non-consensual sharing of nude images and videos that had earlier been shared consensually with partners and ex-partners. “This very quickly snowballed into what we recognised as a model – the proliferation of photos and videos on websites, some exclusively Sri Lankan, and we were seeing the fallout from that,” said co-founder Hans Billamoria.

The Grassrooted Trust supported people to report cases to the CID, helped those targeted speak to their parents and spoke to parents directly to take the sting out of the blackmail. Once parents sided with their child, the blackmailer lost leverage and, in many cases, backed off.

In 2021 came an effort to simplify the reporting process for image-based sexual abuse through Delete Nothing, a trilingual web-based tool to document, provide resources for and support cases of such abuse. Recognising how traumatising and re-traumatising it was for those whose images had been shared non-consensually online to retell their story to agency after agency while seeking redress, its creators built a tool for people to disclose their experience in a standardised form. The person reporting could then download their case record, accessible only to them via a reference number, with no identifying information retained on the organisation’s side. This record could be carried to other points of contact in the reporting journey, helping victims navigate a confusing system with limited technical capacity, gender sensitivity and procedural consistency. It was a welcome step in simplifying the reporting process.

This went a step further in 2022 with the Prathya hotline, established by Delete Nothing and Hashtag Generation (and now run by the latter), which provided a direct line to streamline the reporting process for people dealing with image-based sexual abuse. The hotline received over 100 reports between 2022 and 2023, underscoring the need for effective mechanisms to support victims of image-based sexual abuse.

In response to these and other concerns, the Online Safety Act (2024) (OSA) established an Online Safety Commission to regulate content and prosecute harmful communications. However, critics argue that while couched in the language of protecting women and vulnerable users, the Act prioritises political control and censorship.

Without clear mechanisms in place to deal with image-based sexual abuse, many turn to women’s organisations, NGOs working on sexual health and education initiatives and the Sri Lanka Computer Emergency Readiness Team (SLCERT), which has received so many reports that it established a voluntary task force on TFGBV. However, reports to organisations that do not work specifically on this issue can go unaddressed due to a lack of clear reporting pathways. And while organisations such as The Grassrooted Trust have direct links to Meta and can expedite the removal of non-consensually shared images from Facebook, Instagram and WhatsApp, their influence is limited on other platforms such as Telegram, and once images circulated on these platforms have been downloaded.

Strengthening legal processes

While the OSA (Section 20) addresses online harassment, including the non-consensual sharing of private information, Sri Lanka’s legal framework currently lacks specific provisions on non-consensual image sharing. In reporting cases of gender-based violence, women often face long delays in case completion, a lack of suitable courts, a lack of legal aid and, importantly, a lack of empathy from those in charge of legal processes. Poor systems of accountability, and further harm and humiliation in the reporting process, result in low rates of reporting, with many having lost faith in seeking legal redress.

The Prathya hotline offered callers technical resolution (speeding up reporting and getting images removed from platforms such as Facebook), psychosocial support once the image was taken down and legal aid to report cases. Dr. Misha’ari Weerabangsa, Co-Lead of Programmes at Delete Nothing, described how users largely requested technical support but declined psychosocial support, which many saw as reflecting negatively on themselves and as likely to draw stigma-laden responses from family. Callers were also reluctant to pursue legal redress, and the few who filed cases eventually pulled out as proceedings dragged on and took over their lives. These responses highlight the stigma around mental health and the deterrent effect that slow, inefficient legal processes have on those who might otherwise opt to report abuse.

Constraints such as laws criminalising queer-identifying people, the absence of marital rape legislation, gendered norms around sexual morality and the common belief that what happens within the home should remain private and beyond public scrutiny in cases of intimate partner violence all reduce the urgency of investigation and justice-seeking. Within structures and cultures of impunity and victim blaming, the onus is often placed on victims of such crimes to keep themselves safe, which only perpetuates a cycle of violence that goes unreported, unpunished and unaddressed.

Shifting the blame from victim to perpetrator

Those whose images are non-consensually shared usually bear the blame and shame that follows, rather than the onus and sanction falling on the person who shared the image. This harmful attitude extends to safeguarding agencies and law enforcement, with reports of police officers shaming complainants for having images of themselves on their phones and shifting the burden of investigation onto them. Research in Sri Lanka finds that tech-facilitated image abuse harms the reputations of victims and their families, which can negatively affect marriage prospects.

While the phenomenon of an aggrieved partner non-consensually sharing nude images of an ex-partner has been referred to as revenge porn, there have been calls to use the term image-based sexual abuse to more accurately reflect where harm has occurred. Rather than focusing on perpetrators’ motivations, the phrase draws attention to the harm caused to those whose images have been shared without consent. It also widens the scope of non-consensual image sharing beyond aggrieved ex-partners to hackers, strangers and those in possession of stolen devices, among others.

Rather than a new practice, image-based sexual abuse is an extension of sexual abuse that existed before the availability of technological platforms. It reflects violence rooted in gender, ethnic and religious discrimination that relies on a culture of victim blaming and shame.

Supporting young people to think critically and develop ethical relational skills is critical to preventing such harm from continuing. Reforming legal processes to ensure they are accountable and survivor-centred is crucial, yet there is also work to be done on the norms that enable and normalise such violence.

Education and cultural change

The NPP’s proposed education reforms include fostering a humanistic and responsible society that respects cultural diversity, implementing an integrated mechanism to prevent sexual and other forms of harassment in tertiary education institutions and running programmes to identify and address school children whose behaviour may lead to anti-social activities and who are vulnerable to risks. Billamoria comments that they are making the right noises, yet how and where such changes will be made remains to be seen.

Dr. Weerabangsa notes that such issues cannot be addressed simply through trainings about rights. They are rooted in widespread misogyny, homophobia, transphobia and harmful stereotypes that have arisen from and fuelled ethno-religious tensions, with Muslim and Tamil women experiencing greater identity-based hate than Sinhala women.

It is crucial that education systems support young people to think critically and navigate social landscapes, both online and offline, ethically. While NGOs like Delete Nothing, Hashtag Generation and The Grassrooted Trust and government agencies such as SLCERT are providing essential services to counter the harm that comes from unethical practices online, it is important to address the social values that underpin such harm and the way they are reproduced on social media. Addressing tech-facilitated sexual and gender-based violence is thus part of a broader conversation on fostering social justice across gender, sexuality and ethno-religious lines. It calls for education that challenges misogyny and identity-based hate, without which Sri Lanka’s increasingly digital, technological future risks deepening existing issues.

While the proposed education policies are promising, questions remain about planning and implementation. How will such plans translate into curriculum, pedagogy and integration across subject areas? With parents and religious leaders influencing, and often preventing, the implementation of relationships and sex education, what will stakeholder involvement and engagement look like? These are critical questions to consider as Sri Lanka becomes increasingly digital, which will require greater safeguarding measures to ensure that trends like image-based sexual abuse are effectively tackled.
