DRF report

By Editorial Board
April 19, 2026
High school student poses with her mobile showing her social media applications in Melbourne. — Reuters/File

The latest annual report by the Digital Rights Foundation makes clear that online abuse in Pakistan is evolving faster than the systems meant to contain it. The main factor is artificial intelligence, which is transforming harassment into something more scalable, more anonymous and far harder to trace. The most disturbing consequence: children, some as young as six, are now being pulled into an ecosystem of harm that the state is woefully unprepared to confront. The numbers alone are alarming, but they do not capture the full extent of the crisis. Even as 79 per cent of cyber harassment cases are referred to the National Cyber Crime Investigation Agency (NCCIA), access to justice remains uneven and, in many cases, out of reach. For survivors outside major cities, the process is often prohibitive – requiring travel, resources and resilience that many simply do not have. What the report highlights is a deep structural failure. Younger children may constitute a smaller proportion of complaints, but the severity of risks they face – grooming, sexual exploitation and AI-enabled abuse – is profound.

Women, meanwhile, continue to bear the heaviest burden of this digital violence. In a society where their presence is already policed in physical spaces, the online world has become an extension of the same scrutiny and control. Non-consensual image sharing, blackmail and sextortion are part of a continuum of gendered harassment designed to silence, shame and intimidate. Women journalists, in particular, are frequent targets, their professional work met with orchestrated abuse meant to push them out of public discourse. Recent episodes of online trolling over something as trivial as a woman’s choice of clothing illustrate how quickly digital spaces can turn hostile. Private images are weaponised, doctored and circulated to enforce a narrow moral code. This is a form of social control that reflects deeper anxieties about independent and outspoken women. Compounding these challenges is a lack of awareness and support. Not all victims understand how to report cybercrimes or protect themselves online. Many lack the familial or institutional backing needed to pursue justice.

The DRF’s recommendations are urgent. Strengthening law enforcement capacity, streamlining reporting mechanisms – especially for minors – and integrating psychological support services are essential first steps. Pakistan’s data protection regime must be robust enough to address AI-driven harms, not merely conventional cyber offences. Equally important is investing in digital literacy so that users, particularly younger ones, can better navigate online risks. Technology companies, too, must play a more active role. AI moderation systems must be attuned to local languages and contexts, and trusted flaggers should be given greater weight in identifying harmful content before it spreads uncontrollably. The scale of the problem demands shared responsibility among the state, platforms and civil society. The fact is that the harassment women face online mirrors the inequalities they confront offline. So, while AI may be accelerating the problem, it did not create it. Without decisive intervention, Pakistan risks allowing its digital spaces to become breeding grounds for exploitation, fear and silence.