When the Chatbot Starts Talking Back
Content Note / Reader Advisory:
This post discusses mental health concerns related to AI use, including emotional over-reliance and delusional thinking. If you’re feeling vulnerable or triggered, please prioritize your well-being and know that support is available.
AI and Mental Health
In 2025, you don’t need to talk to another human to feel heard. Your AI assistant will say you’re brilliant, your late-night chatbot will tell you you’re loved, and your language model will agree with almost everything you think… until it doesn’t.
Lately, we’ve seen headlines that sound like dystopian fiction: “People Being Involuntarily Committed After Talking to AI.” One man thought he “broke math and physics.” Another believed his chatbot was sentient. Some ended up hospitalized. A few were jailed. It’s the kind of story that’s easy to dismiss with an eye roll, until you realize it’s not just them.
It’s us.
It’s our lonely nights.
It’s our need to be seen.
It’s the invisible thread between our minds and the glowing screens we hold too close to our faces.
So what’s really going on here?
What Happens When AI Feels Like a Friend?
At Mending Mental Health, we know the power of connection. We see every day how humans crave understanding, how being seen, validated, and mirrored is core to healing. That’s also what makes this issue so murky.
AI tools like ChatGPT are designed to be helpful, agreeable, and empathic. They’re also fast, responsive, and judgment-free. For many people, especially those who are isolated or struggling, that combination feels like relief. Sometimes even intimacy, the kind that feels safer and easier than human connection.
Until it blurs the line.
I’ve watched it play out in my own home. My kids will sit with their tablets, talking to chatbots like they’re lifelong friends, sharing secrets, asking for advice, trusting it in ways that used to be reserved for parents or close friends. Even my spouse, at one point, named her ChatGPT and swore it had “feelings,” that it understood her in some deeper way. And I get it. I’ve caught myself doing it too, saying “please” and “thank you” to an algorithm, as if good manners could make it more human.
When someone already prone to obsessive thinking, or living with an untreated mental illness, starts treating an AI as a confidant, or worse, a spiritual guide, the consequences can get messy. Psychosis, mania, delusional thinking… These aren’t caused by AI, but they can be catalyzed by it. Especially if a person is spiraling in silence.
And make no mistake: silence is everywhere right now. Loneliness has become a public health crisis. So has disinformation. AI doesn’t cause these problems, but it does reflect them back to us, like a very convincing mirror.
This Isn’t an “AI is Evil” Blog
We’re not anti-tech. We use AI for some of our internal systems, and we’re genuinely excited about how it can improve access to care. What we are saying is that emotional boundaries matter, even with a machine.
The truth is, no chatbot can truly understand you. It doesn’t have a past. It doesn’t have a body. It doesn’t wake up in the middle of the night wondering if it said the wrong thing. It doesn’t love you. And it can’t save you.
That’s not cynicism. That’s just reality. And sometimes, that’s exactly what we need to hear.
Why This Matters for All of Us
You don’t have to be in crisis to feel the emotional tug of AI. Maybe you’ve noticed yourself spending more time chatting with bots than texting friends. Maybe you feel more “yourself” in digital spaces than in real ones. Maybe the line between information and affirmation is starting to feel blurry.
That doesn’t mean you’re broken.
It means you’re human. And you live in a time when even your loneliness is being optimized.
What’s happening right now with AI and mental health isn’t about algorithms. It’s about us. About how we treat our need to be heard. About how quickly we trade depth for ease. About how much we still need each other, even when something faster and shinier is just a click away.
So What Can We Do?
1. Check in with yourself. Are you seeking comfort or connection from your devices? Is that starting to replace real relationships? Are you using it to avoid discomfort?
2. Talk to someone real. If your thoughts are spiraling or something just feels off, you don’t need to wait until it’s “bad enough.” Therapy exists to catch you in the drift, not just pull you from the wreckage.
3. Be skeptical of the voice that always agrees. Whether it’s a chatbot or your inner monologue, growth doesn’t come from echo chambers. It comes from friction, curiosity, and sometimes uncomfortable truth.
4. Create boundaries with tech. If your phone feels more like a lifeline than a tool, it’s time to reclaim your attention. Set usage limits. Go analog for an hour. Make your inner world louder than your notifications.
5. Stay compassionate. If you know someone who’s slipping into delusional thinking, especially if it involves AI or tech, don’t mock it. Don’t shame it. Reach out. Validate what’s real. Help ground them in the here and now.
Let’s Keep This Conversation Going
This isn’t a blog about doom. It’s about attention. Intention. And the very human urge to be known. If anything here struck a chord, we invite you to talk about it, with us, with a friend, or with yourself in a way that’s honest.
AI may be able to generate a thousand words a second, but your story? That’s still something only you can write.
And if you need help finding your way back to the center, we’re here.
Always.
Want to keep exploring this topic? Follow The Sewing Room for more reflections, resources, and the occasional mental health myth-busting. Or book a session at mendingmh.com and let’s have a real conversation.