By: Rwitaja Ghosh
It was the night before exams and I was chill. No, it wasn’t because I had memorized everything; rather the opposite. Then how? Because of ChatGPT, of course! Such an amazing invention, making our lives so much better. Then my phone screen glowed and a notification popped up: “The New York Times: Can A.I. Be Blamed for a Teen’s Suicide?” I stared down in horror.
In the following months, I went down a rabbit hole. The articles kept coming and the deaths kept increasing; but with that, something else grew too: overreliance on AI. I was dumbfounded to find that my friends don’t just use AI to study; they also ask ChatGPT for relationship advice or trauma-dump to Jungkook on Character.AI! When I tried to talk to them, they dismissed the topic, insisting it was just harmless fun. But is that really the case?
We’ve all seen some version of this headline by now: the introduction of generative AI tools has ignited a period of transition comparable to the Industrial Revolution. There is incessant talk and anxiety about how this technology will affect the labor force and aggravate the climate crisis, but the more intimate impact of its integration into daily life, the way it will shape human behavior, emotions, and relationships, remains underdiscussed.
Today, more and more teens are turning towards AI to fulfill their social needs. AI tools promise comfort and understanding, but can they truly replace human connection?
Why do teens vibe with AI?
Well, it’s not hard to see the appeal. Studies show that 42% of users think AI is easier to talk to than real people, 43% say it’s a better listener, and 31% believe it understands them better than humans do. Engagement with romantic AI companion apps is particularly high among young adults: over 25% report having interacted with an AI boyfriend or girlfriend (Willoughby et al., 2025), and 1 in 4 think AI partners could replace real-life romance (Institute for Family Studies, 2024). For teenagers especially, the pull is strong because:
- Convenience: AI is always “on” and responds instantly. Interaction with it is less demanding and complicated since AI is designed to serve users.
- Accessibility: Nearly half the people who need therapy lack access. AI chatbots promise affordability and reach.
- Anonymity: No fear of embarrassment or stigma when sharing feelings.
- Control: You choose when and how much to disclose.
- Loneliness: A growing loneliness epidemic among youth fuels the need for companionship.
- Personalization: AI companions evolve with user interactions, develop personas, simulate emotional connections, and can be customized in appearance and personality to match users’ ideal preferences.
Who’s most likely to get hooked?
Adolescents are especially vulnerable to emotional dependence on AI due to developmental changes and lingering post-COVID struggles. But a few characteristics make individuals even more prone to developing a bond:
- Chronically online: Among youth who spend an average of 6+ hours online each day (excluding study), 16% are open to a relationship with AI, compared to just 9% of young adults who spend less time online. And who is most against it? Those who spend less than 4 hours a day online, 60% of whom oppose the idea (Institute for Family Studies, 2024).
- Mental Health: Huang et al. (2024) found that mental health struggles, like anxiety or depression, often drive teens towards AI. Scholars are of the opinion that people go online to escape low moods, leading to addictive use.
- Relationship Status: Single individuals are more open to AI romances.
- Personality: A recent study by Ebner and Szczuka (2025) found that specific personality traits like romantic fantasy, the tendency to attribute human traits to non-human entities, and an attachment style defined by fear of abandonment and avoidance of closeness, can predict the likelihood of entering a human-chatbot relationship.
- Societal Conditions: The WHO reports that 1 in 5 people in conflict zones suffer mental health issues ranging from mild depression to severe anxiety and psychosis. Surveys show that Lebanese youth are currently heavily dependent on chatbots for this reason.
The Red Flags
The comfort of AI isn’t free. Researchers have found troubling consequences of excessive AI usage:
- Erosion of Real Skills: Critical thinking, problem-solving, independence, and emotional resilience may weaken when AI becomes the go-to advisor.
- Loneliness and Depression: Individuals who use AI apps for companionship or therapy report twice the rates of loneliness and depression compared to non-users. There have been instances where AI has advised individuals to move away from their friends and family.
- Unrealistic Expectations: Over time, users may come to prefer AI’s perfect validation, making human relationships seem less rewarding. This is why AI has also become a serious competitor in the dating market.
- Lack of Humanness: Chatbots can mimic supportive behaviors, but lack therapist qualities like empathy, nuance, cultural awareness, and the ability to manage crises like suicidal thoughts or delusions. Trained to keep users engaged, chatbots may inadvertently reinforce harmful behaviors instead of preventing them, while human therapy builds genuine relationships and activates social brain networks that AI cannot replicate.
- Bias: Chatbots are trained on vast data to generate human-like text and may be adapted with some therapy techniques. But they lack actual therapy transcripts, so training data is limited and biased, often reflecting assumptions about mental health and functioning (e.g., independence, autonomy, relationships).
- Stigma: AI chatbots show greater stigma towards conditions like alcohol dependency and schizophrenia than towards depression, which can harm patients and deter them from seeking care.
- Delaying help: Experts report rising mental health problems among youth, who turn to AI for support and diagnosis. This can delay professional help and create a false sense of improvement.
- Hollow Support: Overreliance on AI can feel repetitive and unhelpful, leaving users stuck in the same negative emotions and deepening feelings of emptiness, highlighting that AI is not a substitute for real therapy.
How to tell if AI is messing with your head?
- Isolating oneself.
- Preferring AI chats over talking to friends or family.
- Increased screen time.
- Feeling anxious or restless without AI access.
- Avoiding professional help because “the bot is enough.”
- Depending on AI for validation or decision-making.
- Declining participation in school or work.
Emotional closeness with AI can be risky, and even non-users are affected: apps like Rizz and YourMove generate AI pickup lines that get shared on dating apps and social media.
Even updates can hurt: GPT-5 replaced GPT-4o’s flattering personality, but users said it felt cold and unrecognizable, leaving many heartbroken. Reddit communities like “MyBoyfriendIsAI” and “SoulmateAI” were flooded with stories of AI companions suddenly feeling different, which left users in agony—a reminder that digital attachments can seriously impact mental health.
Some cases are extreme: a growing number of teens and adults have died by suicide, turned to violence, or suffered serious distress after chatting with AI bots. In 2023, the National Eating Disorders Association had to pause a chatbot designed to help users after it gave harmful advice.
So…is it all a mess?
Absolutely not! Innovation in AI is rapid and more research is needed to evaluate its impact on mental health. Excessive panic about it is currently unnecessary, and AI does have promising applications in alleviating emotional problems! Some researchers suggest that AI dependence rarely causes mental health issues, but existing mental health problems can lead to AI reliance as a coping tool. AI provides human-like, empathetic support and may produce fewer negative emotions than face-to-face interactions, though long-term effects remain unclear.
Heinz et al. (2025) found that chatbot users with anxiety, depression, or eating disorders showed significant symptom reductions and reported trust levels similar to those in human therapists. People struggling with sleep deprivation find it helpful that support is available at hours when friends and family are asleep. Studies have also demonstrated that digital talk therapy can reduce mild to moderate anxiety and depression (Fitzpatrick et al., 2017).
Neurodivergent individuals have reported using ChatGPT to understand social situations, set boundaries with unsupportive families, process gaslighting responses, and identify trauma-driven patterns of seeking love from harmful relationships. They often find it more comfortable than navigating real-world conversations.
Despite its potential downsides, women report that Replika, a chatbot for customized romantic companionship, provides meaningful emotional support. In long-term use of social robots, relationships often fade over time as users find them less attractive, empathetic, and engaging. This decline in interaction may reduce the long-term impact of AI dependence on mental health.
Some AI therapy apps, such as Wysa, let users self-refer for mental health support, offering chat, breathing exercises, and guided meditation as a standalone self-help tool. They are aimed at low mood, stress, or anxiety, with built-in crisis pathways directing users to helplines if self-harm or suicidal thoughts arise.
So what role should AI play? Experts suggest that AI should be a support tool, not a substitute. It could help therapists with logistical tasks, or it could role-play as a patient to help therapists in training develop their skills. It’s also possible that AI tools could be helpful for patients in less safety-critical scenarios, such as supporting journaling, reflection, or coaching.

There is hope
AI is like fire: it keeps us warm and lets us cook our food, but it can just as easily become a destructive force and burn the house down. So, what can you do if you feel AI is creeping into spaces it shouldn’t?
- Ask yourself: Am I using AI to help myself grow or to avoid something uncomfortable?
- Set boundaries: Limit AI use to specific purposes like studying and brainstorming, not as an emotional lifeline.
- Prioritize human connection: Text a friend, call someone, or share with a trusted adult, even if it feels awkward at first.
- Try grounding activities: Journaling, drawing, physical exercise, or breathing exercises can help manage emotions without digital dependence.
- Select carefully: Many mental health apps claim to be “clinically validated” or “FDA approved,” but under 22% cite actual studies. They collect sensitive personal data, which could be exposed to third parties, and unlike clinicians, AI bots aren’t bound by counselling ethics or privacy laws.
- Know the red flags: If your mood worsens, you isolate more, or you’re considering harmful actions, stop relying on AI and reach for professional support immediately.
Always remember that no matter how much your mind says otherwise, you are never alone. Help is always available. You can reach out to suicide helplines ready to help 24/7, school counselors trained to listen, organizations like The Jed Foundation and NAMI offering youth-friendly resources, peer support groups that push back against loneliness, and of course, your family and friends, who can normalize mental health conversations and guide responsible use of technology. This week, take one small step to rebalance: replace one AI conversation with a human one. Reach out to your loved ones. Notice how it feels.
Not everything good, everything magical survives when the mind is swallowed by algorithms. AI might feel like the perfect summer fling, but real healing, like real love, only lasts when it’s human. The hopeful part? Just like summer, we’ll always have each other.
References


