From three recent decisions of Pakistan’s apex court to the wildfire spread of AI-generated deepfakes, the entire spectrum, digital as well as physical, has become a persistent battlefield for women in a deeply skewed patriarchal society.
But, given the ease, scale and anonymity of digital spaces, this mutated hydra of gender-based violence – technology-facilitated gender-based violence, or TFGBV – encompassing harassment, stalking, impersonation, image-based abuse, deepfakes, doxxing, coordinated online attacks and more, is growing at an alarming pace.
The scale of the problem is global: 85 per cent of women have experienced or witnessed online violence, and 95 per cent of AI-generated deepfakes circulating online are obscene depictions of women. Online harm also spills into the physical world; one-fifth of women who have experienced online violence report being attacked offline. As technology continues to evolve, the methods used to perpetrate TFGBV are likely to grow in both sophistication and reach.
Locally, data from the past several years shows that approximately 1.8 million women encountered cyber harassment or bullying, while 2.7 million cybercrime-related complaints were received nationwide. Nearly 80 per cent of these complaints involved women and children. Yet only 8,000 complaints were formally registered as cases, leading to just 225 convictions. The gap between harm experienced and justice delivered is stark. In one case, an 18-year-old Pakistani girl was killed in the name of ‘honour’ after a photoshopped image of her went viral on social media. Beyond individual tragedies, online misogyny and TFGBV can fuel radicalisation and violent extremism, serving as recruitment tools for groups that seek to roll back women’s rights.
This malady has two dimensions, as a recent UNFPA study also highlights. First, direct TFGBV, involving a clear perpetrator with mens rea, is relatively straightforward to identify. It includes overtly harmful acts such as image-based abuse, blackmail, cyberstalking, sextortion and doxxing. Most of it is recognised as crime under the law of the land.
Second is structural TFGBV, which is more complex and harder to identify. It operates through pervasive behavioural patterns that make digital spaces hostile to women, perpetuating marginalisation and exclusion, reflected in a stark digital gender gap. This may include constant sexist harassment, gender-based trolling, misogynistic commentary, and coordinated abuse campaigns. It enables the former, may at times not be criminalised, and is addressed under different regulatory regimes.
And unsurprisingly (and sadly), we are failing to curb this predicament along both dimensions because of two evils: the lack of robust legislation and feeble implementation. While the law under scrutiny, the Prevention of Electronic Crimes Act (PECA) 2016, with amendments introduced in 2023 and 2025, has expanded institutional structures, it has not resolved core legal gaps, and crimes against women online persist. Instead, as legal experts have critiqued time and again, the law’s cloud of vagueness is being exploited to axe dissent rather than to strangle the perverse TFGBV.
The sections most frequently invoked in cases of TFGBV reveal these limitations. Section 20, dealing with offences against dignity, criminalises the public dissemination of false or harmful information. It is the most commonly used provision in digital harassment cases, yet it does not cover threats of dissemination – a critical omission, given that the threat of exposure alone inflicts serious psychological harm and enables coercion.
Section 16, addressing identity misuse, criminalises impersonation and doxxing but is non-cognisable, requiring a court order before investigation can begin. Section 21, concerning offences against modesty, includes sexually explicit content and now captures deepfakes; it is cognisable and non-compoundable, yet again fails to cover threats where images are never released.
The law’s focus remains firmly on dissemination rather than on coercion, intimidation, cumulative harm, or even data protection (how personal data is collected, stored, shared, or misused in the first place). Beyond the text of the law, jurisprudential ambiguity further muddles the scene: with new platforms in place – the Social Media Protection and Regulatory Authority and the Council – which complaint goes where remains labyrinthine.
The second hydra, feeble implementation, is beset by its own challenges, further undermining legislative intent. The National Cyber Crime Investigation Agency (NCCIA), successor to the Cyber Crime Wing of the Federal Investigation Agency (CCW-FIA), operates just 15 cybercrime reporting stations for a population of 250 million. Mandatory in-person verification remains a significant barrier, particularly for women facing stigma, mobility constraints or threats.
Further, capacity issues persist. In 2024, 135,000 complaints were filed – around 900 cases per investigation officer – with 65 per cent producing no outcome. Online harassment accounted for just 459 FIRs, a drop in the ocean compared with the 20,000 complaints of digital violations handled by DRF from 2016 to 2024. Adding fuel to the fire, evidence-storage infrastructure remains feeble, raising grave concerns about data leaks, especially in cases involving sensitive images. Geographic disparities compound these challenges. In parts of Balochistan, vast distances and weak institutional presence have resulted in stations reporting zero FIR conversions despite receiving complaints.
Given this dismal situation and multidimensional structural predicament, for many households the only remedy seems to be restricting women’s digital presence altogether. This further entrenches the patriarchal fabric of society; ergo, marginalisation of the already marginalised, in both the physical and digital worlds, perpetuating the vicious cycle of violence and harassment.
What is the way forward then? A stronger, survivor-centred response to TFGBV demands legal reform, effective enforcement and sustained prevention. Legal priorities include updating Pakistan’s cybercrime framework by introducing a dedicated TFGBV chapter within PECA that clearly defines emerging harms, enables proportionate penalties and explicitly criminalises threats of dissemination, so coercion and extortion are actionable even before content is shared.
The data protection regime should also be strengthened along the lines of the European Union’s General Data Protection Regulation (GDPR) and the UK’s Data Protection Act 2018, and now the Data Use and Access Act 2025 (DUAA). Enforcement should be strengthened by ring-fencing dedicated funding and expanding access to cybercrime services: ideally establishing at least one cybercrime police station per district, improving staffing and investigative capacity, and creating specialised TFGBV units with meaningful female representation and trauma-informed procedures.
Survivor protection must be embedded throughout the system, with robust protocols for privacy, evidence handling, and safe reporting that reduce re-traumatisation and prevent further exposure of harmful content. Platform accountability is also essential, particularly in high-risk cases. This requires enforceable obligations for rapid platform cooperation in emergencies, such as intimate image abuse, deepfakes and credible threats, alongside clearer reporting pathways, timely responses and transparent case handling within the regulatory framework.
Prevention and coordination are critical to reducing TFGBV at scale. Sustained public education, especially programming that engages men and boys, should be paired with national digital citizenship initiatives that address online consent, deepfake literacy and the real-world consequences of online abuse. Implementation can be strengthened through a national TFGBV task force and deeper regional and international collaboration, particularly among Global South countries, to shape shared standards, accountability and solutions.
Community-led approaches can further complement these policy shifts. A case in point is the Dost Initiative, founded by Uswa Maryam, which convenes study circles to promote adult literacy and support social and emotional learning. Its project, ‘Mehfooz Phir Se – Safe Again’, focuses on reimagining what safety means in practice, aiming to spark critical cultural conversations around consent, boundaries and breaking the silence on gender-based violence, including TFGBV, with support from psychotherapists, educators, students and activists.
Ultimately, confronting TFGBV requires treating online harm as real harm – harm that can escalate quickly, silence participation and spill into everyday life. Laws, enforcement and platform duties are essential, but ending it requires shifting the norms that enable abuse.
With survivor-centred systems and prevention, digital citizenship, community education and consent-based culture, safety becomes a shared responsibility. The question now is whether institutions, companies and communities will act fast and together.
Furqan Ali is a Peshawar-based researcher who works in the financial sector. He can be reached at: [email protected]
Arfa Ijaz is an energy researcher working at the Sustainable Development Policy Institute (SDPI), Islamabad.