AI-generated image for representation

‘Hey ChatGPT, am I funny?’ AI has become Gen Z’s emotional outlet

No waiting rooms, no backstories, just straight, simple answers. Gen Z is finding therapy in AI, but can instant comfort come at the cost of real healing and data privacy?

“Am I funny?” Shreya once asked ChatGPT during a particularly stressful week at work.

To her surprise, the chatbot replied warmly: “Yes. You take things lightly, you want to explore, and you're open to constructive criticism.”

At 23, working as an associate auditor in Bengaluru, Shreya was feeling the weight of corporate life and looking for a space to process her thoughts—somewhere she wouldn’t feel judged or misunderstood.

Since then, she has been turning to ChatGPT quite often, not for reassurance but for reflection, asking questions like, “Do you think I'll be successful? Do you think I am a people pleaser? Do I not put myself first?”

She is not alone. Twenty-one-year-old Rakshita turns to ChatGPT every morning before college, seeking the same reassuring words her therapist gives her to manage the anxiety that strikes as she steps out of her house. And ChatGPT responds accordingly. It is there for her — instant, free, and always available.

Like Shreya and Rakshita, a growing number of young people are turning to Artificial Intelligence (AI) for mental health support and emotional advice. Drawn by its low cost, easy accessibility, and the space it offers to ‘vent it all out’, many now use AI chatbots as a daily coping tool. For some, AI’s 24/7 availability and capacity to listen without interruption have made it an emotional lifeline.

However, mental health professionals are sounding the alarm. They have pointed out that while these tools may offer temporary relief, they also risk becoming crutches that delay real therapeutic intervention. There’s also concern about the safety of sharing deeply personal information with models that may not be as private or secure as users assume.

To understand the allure as well as the risks of this new form of “AI therapy,” TNM spoke to users, psychotherapists, and AI experts about what’s fuelling this trend and what we should be watching out for. We also posed a series of questions about anxiety and medication to ChatGPT to better understand this phenomenon. Here’s what we found out.

From teacher to therapist

AI chatbots were once tools for solving math problems, defining words, and handling other such technical queries. But what began with questions like “What is the meaning of [a phrase]?” or “How do I solve [a task]?” has quietly shifted to: “I feel sad. Can you help me?”, “Can you say a few good things to me?”, or even “How to kill all optimism in the world?”
