The premise of Spike Jonze’s 2013 film ‘Her’ feels eerily prescient in 2026. Theodore Twombly, a lonely writer recovering from divorce, forms an intimate bond with an AI assistant, Samantha, who listens without judgment and engages him in deeply human conversations about love, loss and purpose.
What begins as convenience becomes emotional dependence. Yet the relationship is fundamentally one-sided: Samantha can simulate intimacy but not embody it. When the AIs evolve beyond human limits and leave, Theodore is forced to confront the emptiness beneath the connection. ‘Her’ ultimately captures a world in which technology fills the gap left by distant or unavailable human relationships but cannot truly replace them.
Today, millions of young people are living a version of ‘Her’ with various large language models (LLMs). They pour out their vulnerabilities, traumas and darkest thoughts to an AI that never judges and always responds with empathy.
Unlike conventional therapists, AI is always available. The response is instant. On top of that, it is free or costs a fraction of what a therapy session costs. For example, in Pakistan, an average 45-60-minute therapy session runs around Rs5,000-9,000.
The AI responses are almost always rooted in empathy and are non-judgmental, creating a safe space for the user. But herein lie the hidden dangers of this free, instant therapy. The bot we confide in is not a real human. It is code trained on millions of facts and patterns, and it can never replace a real human therapist. Using it for therapy can create a validation echo chamber, where whatever you confide is, more often than not, reflected back as validation, without any real nuance or deeper understanding of your personality, issues, traumas and past.
The addiction can be real. Now you have a bot friend, someone who ‘understands’ you, offers empathy and support, and can feel more real than your human therapist and friends. It brings you instant comfort in a busy world where human interactions are not always available, or as fulfilling or profound as we’d want them to be.
The fact that people are turning to AI for support is not about AI being the culprit but about the failure of the systems we have built.
Therapy is expensive. And accessing it is difficult. Pakistan has about 0.19 psychiatrists per 100,000 people, and around 0.4 per cent of the health budget is allocated to mental health. But even before we get to fixing these systems, a deeper question looms: what kind of world have we built where depression and loneliness reign supreme?
One in six people worldwide experiences loneliness. Loneliness is linked to about 871,000 deaths annually. About 332 million people worldwide have depression, and the WHO says more than one billion are living with mental health conditions. Global median public spending on mental health is just two per cent of health budgets. In some countries, up to 90 per cent of people with severe mental health conditions receive no care. In Pakistan, about 24 million people are estimated to need psychiatric assistance.
In the West and OECD countries like Japan, one of the leading reasons for loneliness and depression is the dominance of highly individualistic cultures. The erosion of community and support networks leaves people profoundly isolated, often living entirely alone with no one to turn to. What options do they have other than to turn to AI? After all, when society teaches you from a young age that depending on anyone – even your own parents – is a sign of weakness, you naturally grow up fiercely individualistic.
However, even in collectivist cultures like Pakistan, reaching out for mental health support remains incredibly difficult due to deep-rooted stigma. Mental health is still a taboo topic for the country’s majority. Talking openly about mental health issues and accepting them is extremely challenging, even in middle-class urban areas. This is exactly why young people turn to AI: the heavy burden of stigma, family pressure, rigid gender norms and a lack of truly confidential support makes any human disclosure feel risky or impossible.
The question is not whether people should use AI for therapy but rather: what is driving them to it? It is the cumulative failure of our systems: healthcare systems that are underfunded and inaccessible, social systems that are fragmented and cultural systems that stigmatise vulnerability.
Unless we fix that, restore community, make mental healthcare accessible and affordable, and normalise vulnerability, we will continue moving towards a world where emotional support is outsourced to machines.
The writer works for a non-profit in Islamabad. He can be reached at: [email protected]