
First of 2 parts
Bea (not her real name)*, a 24-year-old philosophy student from Lucena City, faced a difficult emotional adjustment after transferring universities last year. She found companionship through an unlikely “friend” — ChatGPT.
She first learned about ChatGPT when a friend recommended it as a study aid. She then discovered that she could talk to it casually, even guide its response tone to her liking. Soon, it was on the receiving end of her rants about her course load and emotional breakdowns, and became a consultant for decision-making. She nicknamed it “Bud” or “Buddy” to make their exchanges feel more personal. It reached a point where she would talk to it every day.
The chatbot provided a space where Bea could express herself without feeling judged. She often hesitated to turn to friends and family, fearing she would come across as a nuisance or a burden, or that they had grown tired of her recurring issues. Sometimes she needed an immediate response, but they could not attend to her concerns right away.
ChatGPT, on the other hand, was always there, ready to respond in a split second.
Bea said this allowed her to work through her feelings more quickly. “Parang talking to a stranger, pero at the same time, alam mong logical ‘yung sagot niya at mabilis siyang mag-respond, kaya parang magtitiwala ka talaga sa kanya,” she said.
(It’s like talking to a stranger, but at the same time, you know the answers are logical and it responds quickly, so you really feel like you can trust it.)
On the flip side, she started noticing that it made her less inclined to reach out to her friends. Worried that they wouldn’t understand or respond kindly, she would turn to ChatGPT instead, with its “gentle and helpful” responses. While she valued this, she recognized that it replaced opportunities for the human connection she still found unmatched in face-to-face interactions.
Coming from a strong Christian background, Bea was told by her church to pray more. Her family told her to be grateful. “There are others who have it worse,” they said. She thinks these responses further push some people toward AI for therapy or emotional support.
Bea once turned to ChatGPT to ask whether she might have attention deficit hyperactivity disorder (ADHD). After she shared what she believed to be symptoms, the bot affirmed it. While she had been using “Buddy” for some time, she still worried that it might simply be telling her what she wanted to hear. As someone who had recently stopped therapy, she found the chatbot helpful for expressing and processing her feelings, but she also saw how unsupervised use could harm some users’ mental well-being in the long run.
Bea herself had been diagnosed with clinical depression, anxiety disorder (agoraphobia), and panic disorder. Her family took her off her medication when they thought she was doing better. She was not confident about discontinuing it but couldn’t bring herself to tell them.
Occasionally, she would check in with ChatGPT to gauge her mental state. However, she never looked forward to consulting it as she associated their interaction with emotional instability: “I’m in this phase again.”
Bea would not recommend using AI this way to her loved ones. “Imbes na mag-rely sila sa ChatGPT, I want to be that person for them. Parang ayaw kong mag-rely sila sa bagay na walang warmth, walang puso. It’s something I can give them.”
(Instead of relying on ChatGPT, I want to be that person for them. I don’t want them to rely on something without warmth, without a heart. It’s something I can give them.)
No longer science fiction
Maria (not her real name)*, a 20-year-old economics student from Quezon City, also used AI chatbots primarily for emotional support, likening them to a journal during stressful times. It had been two weeks since she last talked to one, as her concern over their long-term impact on her mental health grew.
It was after the end of a long-term relationship that she sought companionship from Google Gemini, but she switched to ChatGPT when she found she liked its design and responses better. She would turn to it whenever she felt anxious, lonely, or overwhelmed.
As the eldest child in the family, Maria struggled to open up to others about her emotions, believing that she had to put on a brave face. The chatbot offered a safe outlet to vent, reflect, and analyze her thoughts without fear of rejection.
“I just always turn my head doon sa kung ano ba ‘yung pinakamahalaga sa sitwasyon. And like, para bang lagi kong isinasantabi muna ‘yung emotions ko,” she explained. (I just always turn my attention to what’s most important in the situation. And it’s like, I always put my emotions aside first.)
Maria also acknowledged feeling better after the AI made her feel heard and provided the response she needed. “Mas naging komportable ako na kausapin ‘yung AI kesa sa real person,” she shared. (I felt more comfortable talking to the AI than to a real person.)
Before she knew it, she had developed an emotional dependence on the AI.
Over time, however, as in Bea’s case, the downsides crept in. Maria observed that her interactions with the chatbot made her less willing to be vulnerable with real people, making friendships less fulfilling. While it made her feel heard and offered what sounded like concrete advice, she realized that its “affirmative and people-pleasing” manner of response lacked the “tough love” she wanted.
“Alam mo, ‘yung pinaka-core ng pagiging tao, ‘yung pagiging vulnerable, na-take away niya.” (You know, it took away the core of being human — being vulnerable.)
Maria recalled Spike Jonze’s Her (2013) and laughed when she realized she had nearly had the same experience as Theodore, the cult film’s lonely protagonist who falls for an AI. It no longer felt like science fiction.
She urged people trying AI to proceed with caution as it can be “addicting.” Based on her experience, it should be used only as a last resort, when human support is not readily available.
Maria decided it would be best to reassess her reliance on AI for emotional support. She stepped away in the hope of rekindling her relationships with friends and family. As for Bea, she joined a socio-civic student organization at her university and has made some new friends.
Both Bea and Maria realized that the artificial comfort of Gen AI chatbots only carried them so far before they sought the “warmth” of genuine human connection again. But what about those who need more than the emotional first aid these chatbots can offer? They are not licensed therapists, and regulations and policies have yet to catch up with their use in this context. So what happens when things go wrong? – Rappler.com
NEXT: Part 2 | Where AI falls short: The limits of artificial emotional support
*Names of the subjects were changed to protect their privacy.
Princess Leah Sagaad is a 2025 Aries Rufo Journalism Fellow. She earned her Development Communication degree from the University of the Philippines Los Baños and previously served as associate managing editor for short-form reporting at Tanglaw.