ISLAMABAD: While Pakistan’s National AI Policy 2025 prioritises innovation and growth, it lacks enforceable safeguards to protect rights, allowing monitoring, censorship, algorithmic bias, and technology-facilitated harms to flourish unchecked, according to a study, “Emerging Technologies in Pakistan: Towards a People-Centred Policy Framework,” conducted by the Digital Rights Foundation (DRF).
The findings of the study were discussed at a high-level national roundtable on ‘Emerging Technologies in Pakistan,’ which brought together over 40 stakeholders from the government, media, civil society, international organisations, and the technology sector. The roundtable aimed to highlight the growing gap between rapid technological adoption and human rights protections in the country.
Drawing on 11 focus group discussions with 79 participants, 60 survey responses, and expert interviews, the study captures how people on the ground are already experiencing the impacts of AI. As many as 65 percent of participants feared loss of privacy, 63 percent expressed concern about disinformation, and half worried about AI-enabled monitoring. Although 84 percent of Pakistani newsrooms use AI tools, only 12 percent have formal policies guiding ethical use. Research participants also shared alarming real-world examples, including women targeted by AI-generated deepfakes, AI-fuelled disinformation exacerbating sectarian violence in Kurram, and journalists relying on tools like ChatGPT and DALL-E while fearing job displacement and automation bias. Many expressed a sense of fatalism: “AI is already here; we just have to deal with it.”
The roundtable brought together representatives from the Ministry of Information and Broadcasting, the Ministry of Human Rights, PTA, NCCIA, NCHR, NCRC, leading media houses, journalist unions, civil society organisations, UN agencies, diplomatic missions, and the tech sector.
The EU delegation to Pakistan was represented by Jeroen Willems, Head of Cooperation, who underscored the importance of international partnership in rights-based digital governance. He said, “DRF’s research and convening role are particularly important in grounding policy discussions in local realities and lived experiences, especially those of journalists, civil society actors, and marginalised communities who are often most impacted by these technologies.”
DRF Founder Nighat Dad remarked, “Pakistan cannot afford to treat AI as inevitable or copy-paste Global North regulations. We need governance built from the margins, centring journalists facing censorship, women targeted by deepfakes, and workers displaced by automation. A people-centred approach is not optional: it is urgent.”
The discussion concluded with a consensus that Pakistan must move beyond techno-solutionism towards human rights-centred, participatory governance rooted in Global South realities. Immediate priorities identified include mandatory human rights impact assessments for AI deployments, passage of long-delayed data protection legislation, transparent content moderation frameworks, participatory AI oversight bodies, protections for journalists and workers, and environmental accountability for AI infrastructure.