What AI Can (and Can’t) Offer When It Comes to Mental Health
Type “ChatGPT as a therapist” into your search bar and you’ll find countless Reddit threads and articles claiming ChatGPT is “better than my therapist,” or citing “research” that supposedly shows ChatGPT is more empathetic than a human therapist. Sigh.
While I’m not doubting the people sharing their experiences with ChatGPT, I believe those glowing reviews likely say much more about their previous experiences with therapists than about the tool itself. If somebody has only worked with a therapist (or multiple therapists) who wasn’t a good fit for them, it’s no surprise that ChatGPT comes across as a relatively miraculous alternative.
It’s free. It’s available 24/7. It never fails to validate and hype you up. And it completely fails to foster the kind of relationship that therapy is built upon. ChatGPT and other AI agents excel at sharing step-by-step instructions, brainstorming, and regurgitating information that’s already available on the internet.
If that’s what you’re looking for, that’s great! Go to town with ChatGPT. But let’s not confuse that with actual therapy.
Good therapists help you peel back your many human layers in a way that AI isn’t able to do. They connect the dots between your past, present and future. And perhaps most of all, they use their own humanity and lived experiences as a foundation for empathy and understanding, and shine a spotlight on the patterns you might not be able to see on your own.
That said, it’s worth exploring what AI actually can and can’t offer, especially as more people start turning to tools like ChatGPT for emotional support. Let’s start with how these tools are designed to work (spoiler: it’s not as a therapist).
The Rise of AI in Mental Health Support
Large language models like ChatGPT are trained on massive datasets of human conversation, books, websites, and other written content. They’re designed to mimic the patterns of human language, generating responses that sound natural, supportive, and sometimes even wise. It’s easy to understand the appeal when those responses come in the exact moment you’re feeling overwhelmed, confused, or alone.
Those three sentences just above were written by ChatGPT, and that’s what it’s great at: sharing accessible (not arcane) knowledge with low stakes. If something in that description is wrong, it doesn’t really matter.
Companies have begun designing apps and chatbots that position themselves as AI therapists or companions (I won’t link to them here, but they’re not too hard to find). Some of these tools are branded as emotional support apps, offering CBT-style check-ins or mood tracking. Others lean into relationship-style engagement, giving users a constant companion to talk to.
Pros and Cons of Using ChatGPT as a Therapist
There are some clear benefits to using AI tools like ChatGPT. AI is accessible, responsive, and endlessly affirming (at times, perhaps too affirming). For some, it may be the first time they’ve felt consistently validated or emotionally seen, even if they intellectually know the tool isn’t a real person.
But those same strengths also reveal the tool’s limitations. AI can’t pick up on nuance. It doesn’t register your tone, your facial expressions, or the deeper emotional context behind your words. It’s been known to “hallucinate” or fabricate information.
I’m a big fan of the podcast Hard Fork, where the two hosts, technology journalists Kevin Roose and Casey Newton, discuss AI all day long. Their take is that while AI is impressive in its language fluency, it can go off the rails and produce dangerously inaccurate or erratic content. They also emphasize the ethical risk when AI developers fail to build in safety measures for people in distress.
Used thoughtfully, ChatGPT can be a helpful stopgap. Used blindly, it can give people a false sense of connection or confidence, without the grounding that comes from real relational work.
In short, here are the pros and cons:
| Pros | Cons |
| --- | --- |
| Available 24/7 | Doesn’t understand nuance like tone, body language, or emotional context |
| Free and doesn’t require insurance | Tends to offer overly flattering responses that reinforce external validation |
| Can feel validating or comforting in moments of stress | Can produce dangerously inaccurate or fabricated content |
| Helpful for people hesitant to talk to a real person | Has no memory or ability to build a long-term understanding of your patterns |
| Provides a space to process when traditional therapy feels out of reach (or actually is out of reach) | Lacks emotional intelligence, accountability, and ethical responsibility |
Where ChatGPT Can Actually Shine
There is one space where ChatGPT and other AI tools can be genuinely helpful: coaching. Not therapy, but coaching. When you’re trying to brainstorm solutions, talk through a work dilemma, or make a pros-and-cons list about a decision (with the exception of the list above), AI can be a surprisingly useful thinking partner.
In this context, ChatGPT acts more like a sounding board than a healer. It can help you organize your thoughts, reflect back ideas, and suggest different angles to consider. It can be great at helping you clarify goals or break down concrete problems into manageable steps. That kind of structure can feel empowering when you’re stuck in indecision or overwhelmed by options.
What it can’t do is help you uncover why you’re stuck in those loops in the first place. It can’t access the unconscious material that shows up in your patterns, projections, and blind spots. It won’t gently (or firmly) point out the defense mechanisms you’re using. And it won’t challenge the stories you’ve been telling yourself for years about who you are and what you’re allowed to want.
When we confuse coaching with therapy, we run the risk of solving the wrong problem or mistaking surface-level relief for real change.
My Perspective as a Therapist
In general, I never discourage the individuals I work with from using a tool that could be helpful, as long as they understand the context and limitations. ChatGPT can offer a moment of relief or a sense of being heard. But it doesn’t pick up on your emotional resonance, your facial expressions, or how your past informs your present.
It also tends to say what you want to hear. This kind of “glazing” can boost your ego in the moment but may reinforce a reliance on external validation instead of helping you learn to trust yourself. And there have been cases where AI gives harmful advice, like recommending eating glue or rocks. If someone is having a mental health crisis, that kind of unpredictability is downright dangerous.
That said, finding a therapist who feels like a good fit can be incredibly hard. Add in scheduling, insurance issues, or preference for in-person vs. virtual, and it’s understandable why someone might default to what’s more available, especially if that option feels less risky emotionally.
I’ve also seen how AI can provide comfort, especially for young people who feel isolated or misunderstood. If you’re exploring your identity or dealing with something you think no one else would understand, the sympathetic tone of an AI response might feel grounding.
AI might sound like it gets you, but it doesn’t know you. Not the way a real person can. Therapy isn’t just about feeling heard. It’s about building something with another human. Trust, emotional connection, and insight can’t be automated.
Final Thoughts
If you’re using ChatGPT or another AI tool to support your mental health, that’s not inherently a bad thing. But I suggest you use it as a supplement, not a substitute.
Let it be a companion while you find a therapist, or a space to reflect between sessions. And always keep in mind what it is: a very sophisticated language pattern matcher, not a person who can hold you through real pain or growth.
If part of the reason you’re turning to AI is that therapy still feels a little too uncomfortable or unfamiliar, I get it. I work with lots of people who are hesitant about therapy and want something more direct, less fluffy, and actually effective. Learn more about how we work with people who are hesitant about therapy.
And if you’re someone who finds it easier to talk to an AI than to a person, there’s probably more to unpack there. We specialize in working with men who feel emotionally disconnected or unsure how to open up.
At the end of the day, emotional growth doesn’t happen in a vacuum—it happens in a relationship. AI might be a tool, but it can never be a witness.
(And that last part was written by ChatGPT. Kind of cheesy, but also, kind of true.)
FAQs
1. Can AI ever replace human therapists?
Without a crystal ball, the best we can say is: not anytime soon, and not in any meaningful or sustainable way. AI tools might be helpful for short-term reflection or journaling-like support, but much of the value of therapy comes from building a relationship with another human, not simply exchanging knowledge in a one-sided interaction.
2. Is it “dangerous” to use ChatGPT as a therapist?
It can be, especially if you’re in a crisis or following advice that hasn’t been vetted by professionals. If the advice you’re seeking is more cut and dried (say, coping tools or ideas on how to talk to your boss), AI is likely fine.
3. What should I do if I feel more comfortable talking to a chatbot or other technology than a person?
That’s a valid feeling and one that we see becoming more and more common. It may mean you’ve felt judged or dismissed in the past. A good therapist can help rebuild trust and teach you how to communicate openly and safely with real people.
4. Can ChatGPT help me with anxiety or stress?
To a certain extent, yes. It might help you reflect or feel less alone. But it won’t help you understand the deeper patterns driving your anxiety or how to shift them in a sustainable way.