Imagine waking up one morning to find that your phone – the only object you truly confide in – has quietly acquired a new inhabitant.
A government-issued, pre-installed, undeletable app that sits inside your device with the confidence of a tenant who knows the landlord won’t evict them. No opt-outs. No consent forms. No polite “Allow only while using the app”. Just a silent piece of surveillance architecture humming beside your messages, your late-night searches, your entire behaviour catalogue, your location history and every photo you swore you deleted.
Ever chuckled at the ‘FBI wants to know your location’ meme? Imagine they already do. And now imagine being told, with a completely straight face, that this is for your safety.
If this thought feels slightly dystopian, slightly too close to home, slightly like the opening scene of a show you would binge but never want to live in – good, because this is no longer a hypothetical. In India, the government has instructed smartphone manufacturers to preload a cyber-safety app on all new phones. A seemingly harmless step, branded as theft prevention, fraud detection, SIM verification, national security. All good things. All things citizens would, in principle, want.
Satisfied because you read the brochure? Here is the mind-bender: It’s the footnotes that get you.
Because once the state walks through the front door of your phone, the issue is bigger than ‘What will they do today?’ It is ‘What won’t they do tomorrow?’ Surveillance is like power. And power does not shrink; it metastasises.
Like everything else that becomes the norm for the rest of the world once our Western forefathers have given it a shot, surveillance has already made the rounds. Yes, major democracies have snooped harder than an aunt at a wedding.
The UK once ran a programme that did something very basic and very unsettling: it quietly collected huge amounts of data about which websites people visited and when. Nothing fancy. Just a giant list of ‘who browsed what’. It was called ‘Karma Police’ – perhaps ‘Trust Issues 101’ was taken – and its goal was embarrassingly straightforward: track everyone, all the time, just in case someone became interesting later.
The Americans weren’t far behind. One of their programmes – ‘Dishfire’ – automatically vacuumed up text messages from around the world: birthday wishes, OTP codes, breakup paragraphs, the whole buffet.
And yet, here lies the uncomfortable truth: the case for surveillance is compelling. Imagine, for a moment, the possibilities of a well-designed tracking system. Imagine being able to identify terrorists the moment they coordinate. Imagine police tracking and intercepting criminals who commit a violent offence and vanish into the night. Imagine finally pinpointing the perpetrators of mob violence – the one crime that hides behind anonymity. Imagine a society where fleeing the scene no longer secures impunity.
The benefits are innumerable. They promise the kind of public welfare Pakistan desperately needs but chronically lacks. And yet, the cost is also immense. Your data is the currency. Your privacy is the collateral. So, what do we value more? A society possibly free of crime? Or a society where your private digital life remains untouched? Safety or solitude? The elimination of violence or the protection of personal space?
Because consider what comes next. Mandatory tracking apps are merely the beginning. The next evolution is the integration of AI into these systems. If you are thinking the friendly, chatty kind, think again. Instead, it will be the forensic, pattern-detecting, behaviour-mapping kind. Apps that monitor movement and infer intent. Government dashboards that go beyond showing your location; they start predicting it. Soon, AI-infused surveillance technology will prevail, one that builds psychological profiles, identifies risks and flags individuals based on patterns invisible to the human eye.
Then, this will no longer be our plain old surveillance. It will be precrime. And it is closer than we think.
But there is a small, precious loophole. One that hasn’t been fully explored but might, for once, work in favour of privacy. Some time ago, Sam Altman – the man behind OpenAI – said that talking to AI should carry privilege: confidentiality of the kind owed to a lawyer or a doctor. A human right of sorts. If that philosophical framing ever mutates into a legal standard – if conversations with AI, digital footprints, photos and behavioural data become protected the way attorney-client communication or medical records are protected – then perhaps this era of supercharged surveillance may not be the death of privacy after all.
Governments could build surveillance systems to catch terrorists, mob lynchers and runaway criminals, and your personal digital life could still sit behind a velvet rope marked ‘Do Not Enter’. AI does the sifting; human authorities only see red flags, not your midnight selfies or unhinged WhatsApp rants. Surveillance without possible voyeurism.
Perhaps the future will not be a tug-of-war between liberty and safety, but a carefully balanced structure in which technology advances without swallowing the citizen whole.
But Pakistan is nowhere close to that conversation. We are entering a technological future with political reflexes still stuck in the 1990s. Every government, without exception, is becoming more sensitive to criticism, more suspicious of dissent, more emboldened in monitoring opponents. We cannot pretend that a mandatory tracking app here would remain confined to catching phone thieves. It would be weaponised faster than it could be coded.
So yes, the questions are grand for the world: safety or privacy, crime-free streets or personal sanctity, technological efficiency or democratic restraint. But our real question is much smaller, much more urgent and far more depressing: if the government here installs a surveillance system tomorrow, can we trust the people who will run it?
Because ultimately, tech itself is neutral. It is the hands holding it that matter.
The writer is a lawyer.