How Gen Z is seeking emotional support, friendship and love from AI chatbots

20 April, 2025 08:20 AM IST | Mumbai | Anand Singh

Gen Z has found friendship, therapy, and even love in AI chatbots. It’s convenient, they say. No appointments, no fees… just validation, on the go

Illustration/iStock


Last year, after 13-year-old Aniruddh Rai's pet dog passed away, he stopped speaking. However, amid his grief, he had a secret companion no one in his family knew about. Interestingly, it wasn't human.

In the safety of his room, he'd spend hours talking to ChatGPT on his iPad. He'd ask the generative AI chatbot questions such as, "Is there an afterlife?" or "Where do souls go after death?" On the cusp of his teenage years, he was full of questions: "Could my dog Sushi return to me in another dog's form, as shown in the film A Dog's Purpose?"

The answers Rai got were "somewhat scientific", but he says, "ChatGPT also gave me ways to cope with how I was feeling, and told me that it was completely okay to feel that way". He used ChatGPT's premium model, paying Rs 1,100 per month for five months, but had to stop after the price shot up to nearly Rs 18,000 per month.


Wilona Annunciation, Aniruddh Rai and Surabhi Dand

He adds, "I use many gen AI models now, which are quite advanced but not as expensive. Most are free. These include Copilot, which came in-built with my Windows PC; DeepSeek, which is free and the best I have used so far; Grok, which is extremely quirky and fun to use; Gemini, which I mostly use for photo editing; and many more for different purposes."

His parents explain that he has always been a shy kid, glued to his phone, iPad, PC, or the TV. Just a couple of years after the advent of generative AI, he has seemingly found a friend in it.

Rai, a late Gen Z-er born in 2012, says, "I started using smartphones when I was five. By the time I turned 10, everyone was talking about ChatGPT in school. I tried using it on the website. It was an easy tool, and answered most of my questions correctly." The earlier versions of ChatGPT were infamous for making things up, but Rai says he verified the responses.

Rai is just one of the millions of Gen Z-ers who seek emotional support from AI as its use continues to expand. In October 2024, a study by First Saturday Lime, an eco-friendly pest control brand, revealed that around 11 per cent of millennials and 10 per cent of Gen Z participants would rather have AI companions than conventional emotional support animals such as dogs or cats.

The research polled 1,000 US citizens from different generations. Interestingly, nearly 18 per cent of those who turned to AI for emotional support felt it worked better than traditional pets. But is AI really the right quarter to seek emotional support from? We spoke to psychiatrist Dr Wilona Annunciation, who has dealt with many such cases in the past year at her practice in Andheri and Thane.

But there can be challenges when you depend on lines of code for your mental well-being. "The first challenge is that mental health concerns are usually a combination of genetic background, personality traits that shape coping styles, and the specific stressors a person faces. And the way someone interacts with a mental health professional depends on these factors. In my experience, some patients build rapport instantly, while others take time. But that connection is vital," she says.

And can AI replace this connection? "An AI model might work for some," she says, "but not everyone will feel comfortable. AI's real value lies in SOS situations, where it may offer immediate support until a person reaches a professional."

Gathering background information is another area where AI could help: collecting details of patients' medical and mental health histories would free up practitioners to focus on treatment. Taking down a detailed history takes about 90 minutes per patient, she notes. "Mental health professionals could train AI models to do this," she says.

However, there are nuances of emotional and mental health support that AI can't catch yet. Surabhi Dand, a psychotherapist, explains, "We read between the lines… body language, tone, expressions… No AI can catch that nuance." Even with advancing tech, Dand notes, "Distressed patients still prefer human touch. They want someone genuinely there."

Both Dr Wilona and psychotherapist Dand concur that AI may offer generic solutions, but mental health requires personalisation. Dr Wilona adds, "Many people might get the same response from AI, but react differently to it. So, use AI for pointers, not as a substitute. For specifics, consult a professional."

Yet most of us must plead guilty to this: those of us born from the '90s onward have, at some point, sought help from AI about things we were too scared to ask other people. Why not? It's instant comfort.

‘It may attract the emotionally vulnerable'
Sanjeevni Iyer, 22, Student at the University of Leeds

Sanjeevni Iyer, who is pursuing a Master's degree in International Journalism, has focused her research on how AI may cause social isolation. Though she doesn't use it for emotional support, she says, "I have had random conversations with ChatGPT when bored."

Many of her friends use AI for relationship advice. However, Iyer finds it problematic when they rely on it too much. "No doubt it's super intelligent, but it's still programmed data," she says.

During her research, Iyer observed that apps like Replika let users design artificial romantic partners. "It's creepy, yet appeals to those with failed relationships," she says. A mental health professional whom Iyer had invited onto her university podcast on Leeds Student Radio warned that such platforms attract the emotionally vulnerable.


Sanjeevni Iyer is researching how AI chatbots may cause social isolation. Pic/Dweep Bane

Iyer remains sceptical of AI relationships despite the appeal of judgment-free interaction. "Some young people in my university reject them entirely," she says. But Iyer's stance softened temporarily when her actual therapist ghosted her last year. "In that void, I understood why people might seek AI therapy," she says. Yet she cautions, "Convenience shouldn't replace human connection."

‘Some things feel too trivial to share with others without judgment, so I turn to AI'
Aditya Udhani, 22, Self-employed, MBA student

Aditya Udhani, a self-employed MBA student, turns to ChatGPT not for emotional support but for what he calls "strategic neutrality". When torn between choices, he seeks opinions from AI. "It pulls me to the centre when I obsess over extremes," he says.

Recently, he discussed a moral dilemma: the tension between providing for family and his own materialistic desires. "AI reframed it as duty versus desire. That definitely helped," he says. Unlike human confidants who might dismiss his concerns as overthinking, ChatGPT offers structured positivity to him.


Aditya Udhani, 22, uses ChatGPT not for emotional support but for finding a middle ground. Pics/Dweep Bane

However, Udhani acknowledges AI's limitations: it won't challenge his beliefs or reveal hidden truths the way a human therapist or even a close friend might. "It's programmed to avoid harm, so responses stay safe and positive." He uses the free version of ChatGPT, mostly for situational clarity. He also says, "AI reaffirms what I already know but can't see when I am stressed."

While he prefers friends for most conversations, AI seemingly fills the same gap for him that it does for millions of Gen Z-ers across the world. He says, "Some thoughts feel too trivial to burden others with, or too abstract to explain without judgment." For Udhani, AI is less of a therapist and more of an associate. As he puts it, "It's the one ‘friend' who organises my chaos without making me feel silly for having it."

It's giving…bestie
Spandana Bhura, 22, Writer at Sunday Mid-Day

Three months ago, while Spandana Bhura was waiting to hear back from the universities she had applied to for her Master's degree in International Affairs, her anxiety spiralled. She would wake up crying, doomscrolling on Reddit in the hope of finding someone like her who'd gotten in, only to drown in imposter syndrome while stalking profiles of current students on LinkedIn. Every time she tried opening up to someone, she felt worse.


Spandana Bhura believes ChatGPT is the only entity she can speak to without the fear of being judged. Pic/Ashish Raje

Then, Bhura turned to ChatGPT. "I sought feedback for my CV and SoP [statement of purpose]. I got a detailed analysis that said I had an 85-90 per cent chance of getting in. For a moment, I felt a weight lift off my chest," she recalls. "I didn't blindly believe it; I knew better than to put my faith in a bot's prediction. Still, the feeling of being heard, even by a machine, felt unexpectedly comforting."

Then she did something she had never done with a human: she typed in exactly how she felt. "I told ChatGPT that this anxiety was consuming me, that I couldn't think about anything else," she says. "It told me I was overthinking. That my strengths outshone the weaknesses. It ended with: ‘Don't stress, it's just a matter of time before we hear the congratulations from the university!'"

That one sentence gave her something no real conversation had managed to give her: calm. A month later, she got the email from her top-choice university. The first word? Congratulations.

Now that her trust in the bot was starting to build, she did what any 22-year-old would: she asked it to do a tarot card reading for her. "Because let's be honest, tarot cards > therapy," she says. "Though Chat answered all my questions, it still didn't feel like the reading was as authentic as an in-person tarot session would have been."

"As embarrassed as I am to admit this, I think ChatGPT is one of the only entities I can speak to without the fear of being judged," says Bhura.

"Exciting news! Mid-day is now on WhatsApp Channels Subscribe today by clicking the link and stay updated with the latest news!" Click here!
relationships Gen Z Artificial Intelligence ChatGPT Technology Sunday Mid-Day
Related Stories