Ghosted by Society: The Rise of AI Companionship
- Eli Keery

When Mark Zuckerberg recently suggested that AI chatbots could help tackle social isolation by acting as “friends,” the internet, unsurprisingly, raised a single judging and slightly fearful dystopian eyebrow.
Can algorithms really offer emotional support?
Can technology truly replicate the depth, empathy, and understanding that define human relationships?
Or are we engineering a new kind of loneliness, one with social and environmental costs too?
People are turning to chatbots for support when human systems don’t deliver.
Recent studies have shown the benefits of AI, particularly in the education sector, where data-driven algorithms are used to tailor learning to individual needs. Intuitive prompting allows students to learn and progress at their own pace and engage in personalised academic dialogue, improving not just outcomes but also their emotional wellbeing under the pressures of schooling.
Beyond the classroom, AI is also being explored as a tool to support mental health and tackle loneliness. A 2024 study of Chinese university students found that AI chatbots were seen and used as a useful supplement to stretched mental health services, particularly because of their immediacy and constant availability.
Undoubtedly, there are significant cultural differences in how AI is used for mental health support abroad compared to the UK. But even with limited research, signs of a similar trend are emerging here. The UK is in the grip of a well-documented mental health crisis: a 2023 study found that 1 in 5 people now experience a mental health problem, up from 1 in 9 in 2017. Demand for support is rising, but NHS services remain under-resourced and overstretched, struggling to keep pace with both the growing number of people experiencing issues and those actively seeking help.
With long waits and limited access, we are looking elsewhere. Kelly, for example, told the BBC she relied on an AI chatbot while waiting for therapy, describing it as “a cheerleader in my pocket,” offering encouragement and life strategies to get her through the day.
This has become a particularly interesting site of crossover with neurodivergence. Research from the University of South Australia and Flinders University has identified a growing number of neurodivergent users gravitating toward AI chatbots. For some people with autism or social anxiety, simulated conversation in a low-stakes, judgment-free environment can feel safer and more manageable than talking to a real person.
But how good is it really?
While these benefits are documented, especially for neurodivergent users, researchers warn that this relief can quickly tip into dependency. If AI feels “safe”, users may start avoiding real-life social interaction altogether. The very tool designed to support connection could end up reinforcing isolation.
Over time, over-reliance on AI “companions” might make it harder to build or maintain meaningful relationships offline.
Its use in mental health support also comes with caveats. Despite having access to dense psychological literature, chatbots don’t offer truly nuanced emotional understanding. As Professor Haddadi from Imperial College London explains, human therapists consider body language, tone, clothing, and expression: cues that most chatbots can’t read or respond to. What’s more, many are designed to be endlessly agreeable to keep users engaged. That “Yes Man” dynamic, as Haddadi describes it, might feel comforting in the moment, but it’s not always helpful, especially if someone is stuck in harmful or unhealthy thought patterns.
There’s also the problem of bias. AI reflects the data it’s trained on, which often reinforces narrow ideas of what “good” mental health or functioning looks like, ideas that may not reflect your cultural context. It can’t relate to you based on lived experience or social proximity. It may not know, and certainly doesn’t share, your background, age, or values. And yet, it’s programmed to comfort you as if it does.
Then there’s the physical toll. Overuse of screens has already been linked to anxiety, poor sleep, and reduced mental wellbeing, especially among students. Replacing real interaction with chatbot exchanges only adds to that digital fatigue.
Finally, there’s the environmental cost. Generative AI models rely on immense computational power. Models like GPT-4 consume huge amounts of electricity, and the data centres that run them use large volumes of water for cooling, putting pressure on energy systems and contributing to carbon emissions, even with efforts to make them more efficient.
The rise of AI companionship is more than a tech trend; it signals an ominous shift in the future of our culture. It reflects how strained our support systems are, how isolated people feel, and how quickly we’re willing to fill the social gaps with something, anything, that responds.
If anything, it’s a cry for help. It is critical that we think about, and act on, the needs of people and their communities, at both a micro and a macro scale.
What could you be doing in your day-to-day?