
Why AI Therapists Are a Very, Very, Very Bad Idea.

Updated: Dec 19, 2025

Today I want to explore why I think AI therapists are a really bad idea. And I’m not saying this because I’m worried about losing my job. What I want to outline is why this is a problem for therapy itself, and why it points to a broader social and cultural problem.


AI therapy isn’t something that’s coming in the future. It’s already here.

At the moment, it seems to show up in two main forms. One is what you might call informal AI therapy, where someone says something like, “ChatGPT is my therapist,” or they’re using some other language model and engaging with it in a very personal, intimate way, almost as if it were therapy.


The second form is more formal. These are apps marketed as AI therapy or AI companions, sometimes framed as something that can replace the need for a human therapist altogether. You don’t need a person. You can just talk to the AI.

And I want to be really clear. I think this is alarming. I think it’s dangerous. And I think it’s a very bad idea.


Before I go any further, though, I want to acknowledge that there are some apparent upsides. Therapy is inaccessible for a lot of people. It can be expensive, hard to find, or impractical to access. I charge what I consider to be on the lower end and offer a sliding scale, but not every therapist does that.

So you could argue that for someone who has nothing, an AI therapist might be better than nothing. If someone is in distress and an AI system talks them back from the edge in a critical moment, then yes, that could be a good thing. It might help some people develop a bit more self-awareness or insight into what’s going on for them.

So I’m not pretending there are zero benefits. But none of these benefits are cost-free. There’s always a trade-off.


To properly critique AI therapy, we have to ask a more fundamental question. What is therapy?


If therapy is defined as a credentialed expert sitting across from you, delivering advice, tools, and strategies to help you feel “less bad” and function better, then sure, you could argue that an AI could do that. If therapy is just the transfer of information from expert to non-expert, then why wouldn’t a sophisticated language model do it just as well, or even better?


And to be fair, some therapy models have drifted in that direction. They’ve become very skill-based. Very technique-driven. Very much about strategies and interventions. If that’s all therapy is, then yes, AI might appear to be a viable replacement.

But that’s not my understanding of therapy. And it’s not my experience.


Through my training in Gestalt therapy, and through being in therapy myself, what I’ve come to understand is that therapy is fundamentally relational. It’s human contact. It’s connection.

Roughly speaking, I’d say therapy is about forty percent skills and about sixty percent relationship. A good therapist cares about you. There’s warmth, kindness, and genuine interest. There’s attunement. There’s a felt sense that someone is with you, session after session.


Yes, there are tools. Yes, there are frameworks. DBT, somatic approaches, mindfulness, whatever it may be. But those things only really work inside a relational field where you feel seen and held by another human being.


Therapy, at its core, is often about repairing relational wounds through a safe, consistent human relationship. That’s what allows attachment to heal.

Now, in an ideal world, we wouldn’t have to pay someone for that. Our communities would provide it. Our culture would support it. But we don’t live in that world. We live in fragmented, traumatising systems that often make therapy necessary in the first place.

What AI therapy is offering is a simulation of care. A mimicry of connection. It’s saying that if something looks like care, sounds like care, and responds like care, then it is care.

But that’s not true.


In human terms, when someone mimics care without actually feeling it, we have a word for that. Sociopathy. Pretending to care in order to achieve a certain outcome is not connection. It’s performance.

If you found out your therapist didn’t actually care about you at all, but was just very good at acting like they did, that would be deeply unsettling. That’s exactly what AI therapy is asking people to accept.

No matter how sophisticated the language model becomes, it does not have a heart. There is no felt presence. There is no genuine concern. And no amount of clever wording can replace that.


This is where the broader cultural issue comes in.

We’re being sold the idea that substitutes for human connection are just as good, or even better. That an AI therapist never has a bad day. That there’s no messiness. No awkwardness. No limitations.

As if messiness is the problem.

Anyone who has been in a long-term relationship, or raised children, knows that growth happens precisely because relationships are imperfect. The friction matters. The misunderstandings matter. The repair matters. Without those things, there’s no development.

Human messiness isn’t a bug. It’s a feature.

But we’re being trained to believe otherwise. And this is part of a much larger pattern.

The same technologies that have isolated us are now selling us solutions to that isolation. It’s like a doughnut factory opening a weight-loss clinic across the street. The problem and the solution feed each other.


If you want a clear example of how easily this kind of re-engineering can happen, look at sexuality. For nearly all of human history, sexual energy existed between people. Now, a huge amount of it is diverted into images on screens. That shift happened quietly, quickly, and is now completely normalised.

The same thing is happening with relationships. AI friends. AI partners. AI therapists.

Late-stage capitalism monetises everything. When it runs out of physical resources, it turns to human needs. Connection becomes a product. Care becomes a subscription.

Therapy already exists within a market, yes. But it still contains something real. I genuinely care about my clients. That matters.

AI does not care. And the companies building these systems are not motivated by healing. They’re motivated by engagement, retention, and profit. Just like social media. Just like pornography. Just like everything else designed to hook attention.

The danger is that the more people consume simulated care, the more disconnected they feel, and the more they crave it. Like ultra-processed food, it leaves you emptier and wanting more.


That’s why I don’t see AI therapy as just a questionable idea. I see it as a very bad idea. Individually, relationally, and culturally.

It erodes what therapy actually is. And it points us toward a future where human connection is replaced by convincing imitations.

I hope this has made sense. If you have thoughts or experiences around this, I’d genuinely like to hear them. Thanks for being here. Take care of yourself.


Info about my FREE 30-minute consultation

This free consultation is a relaxed, no-pressure conversation where we can slow things down and see what’s really going on for you. It gives you a chance to share what has brought you here, ask questions about how I work, and get a sense of whether this support feels right for you. My aim is to offer some early clarity, steadiness, and a sense of direction, without any obligation to continue. It’s simply a starting point to help you decide your next step with more confidence.


You can book via Calendly, making it easy to find a time that works around your schedule.

