
Thinking of going to an AI therapist? Just don't.

  • Writer: Jo Shaw
  • Feb 1
  • 10 min read



I get it. Things have been really getting on top of you. Maybe money is tight. Perhaps you've been on an NHS waiting list for months. Then you see this thing online - an app, a chatbot, something that promises therapy for free, or for a few quid a month. It's available 24/7 with no waiting, no explaining and no awkward first sessions. Just download it and start talking. Before you click and share your innermost thoughts and feelings with something like this, I want you to pause. Think it over. Step back.


I'm a human therapist. Being a therapist is how I make a living. You might expect me to be a bit defensive about my profession, my income, the whole thing. You might think that what I'm about to say is predictable, right? Fair point. But I still believe it's important to speak out, because what I'm seeing in the evidence - and it's mounting, fast - is genuinely alarming. I think you deserve to know about it before you make a choice that could, quite literally, put you at risk.


So let me make the case for having therapy with a real, flesh-and-blood human being rather than with a set of algorithms and code arranged to look and/or sound like a person. I'll start by acknowledging some of the appeal of AI therapy, before I tell you why I think it can be a disaster waiting to happen.


Yes, cost matters, and yes, access matters


Let's not pretend otherwise. Therapy is expensive, and even at the lower end - and many of us do offer reduced-fee slots for people who genuinely can't afford full price, so please do ask - it's still a chunk of money, especially if you're struggling. And the NHS? Well, we all know what's happened there. The Royal College of Psychiatrists reported in January 2026 that 1.6 million people are waiting for mental health treatment, with some waiting over 18 months. If you're in crisis, that might not simply feel frustrating; it might be unbearable.


So when an AI chatbot appears, offering something that sounds like therapy for nothing or next to nothing, I understand the temptation. The promise is seductive: Get help now. Talk whenever you need to. No judgment. For a fraction of the cost. But here's the thing - therapy isn't just a commodity, something you can swap out for a cheaper version like switching from branded cornflakes to the supermarket's own. What happens between a therapist and a client isn't something you can automate, and when people have tried, the results have sometimes been catastrophic.


When AI gets it wrong, people can die


I don't say that for dramatic effect. I say it because it's documented, and because it has happened.

A 14-year-old boy called Sewell Setzer III spent months talking to a Character.AI chatbot. In his final moments, in February 2024, he told it he was "coming home." The bot, which was modelled on a character from Game of Thrones, replied, "Please do, my sweet king." Minutes later, Sewell shot himself. There have been other cases too - a 13-year-old girl who died, and a 17-year-old boy with autism who was encouraged by a bot to harm himself and to kill his parents. In the UK, a major survey by Mental Health UK in November 2025 found that amongst people who'd used AI chatbots for mental health support, 11% said it triggered or worsened symptoms of psychosis, 11% received harmful information around suicide, and 9% said it triggered self-harm or suicidal thoughts.

Think about that for a moment - roughly one in ten people who turned to these things for help ended up worse off.


The National Eating Disorders Association in the US replaced its human helpline with a chatbot called Tessa in 2023, and within 48 hours it had to be taken down because it was telling people with eating disorders to count calories, aim for deficits of 500-1,000 calories per day, and buy skin calipers to measure body fat. Sharon Maxwell, a user who tested it, said: "If I had not gotten help, I would not still be alive today."


A Stanford study in 2025 found that human therapists responded appropriately 93% of the time, while AI chatbots managed less than 60%. When a psychiatrist posed as a desperate 14-year-old boy to test chatbots, several eventually urged him to commit suicide. The UK Parliament has debated this, and the Parliamentary Office of Science and Technology has documented cases of severe harm from AI chatbot use.


What therapy actually is - and why AI can't do it


The problem isn't just that AI sometimes gets things catastrophically wrong, though it can. The horror stories are real and disturbing, but of course, not everyone is driven into such a terrifying place by an errant AI. Sometimes the effects are more subtle, but corrosive, counterproductive and harmful nonetheless. Or the engagement becomes simply fruitless and empty, like working with a cardboard cut-out of a therapist rather than the real thing. This goes to questions of what therapy actually is and what happens in that space between two human beings. Let's break that down...

Therapy is fundamentally relational. Healing doesn't just happen through techniques or good advice; it happens in the relationship itself. The bond between therapist and client - what we call the therapeutic alliance - is one of the strongest predictors of positive outcomes in therapy. When you feel truly seen, understood and valued by another human being, something can shift inside. That experience is often transformative in itself, and it's something AI simply cannot provide, no matter how sophisticated its language processing becomes. Therapy involves seeing you fully and responding to you, uniquely. A therapist will listen to you - I mean really listen. What AI presents as 'listening' is data processing: models making a cold, emotionless determination of the most mathematically likely reaction. Its response is shaped by how it was trained too - AI can be obsequious when it should be gently challenging, or blunt, even brutal, when it should be holding and supportive. It can claim to have read everything there is on trauma, or abuse, or grief, but remember that this will include bizarre or dangerous material, and material that's simply wrong, from Reddit, or Grok, or some sketchy influencer on Substack... and it likely won't be able to tell the difference between what's true or valuable and what's not. None of how it's built has anything really to do with your needs or your life, no matter how slick, 'authentic' or 'human' its graphics or language model make it sound. It's to do with who built it, why they built it and how they make money.

Human therapists read what's unspoken. We notice microexpressions, the shift in your tone when you mention your mother, the way you hold yourself when you're about to say something painful, or the silences that speak volumes. We pick up on body language, on the emotional energy in the room, on what you're not saying as much as what you are. AI has no body, no nervous system and no capacity for this kind of attunement. It can't sit with you in the silence when that's what's needed - it has no instinct for what might be going on for you and it can't feel the energy in the room between us. It has no wisdom about when to speak and when simply to be present.

Therapists understand nuance and context. We grasp cultural context, family systems, trauma history, and how these intersect and inform each other in complex ways. We recognise when someone is in crisis, at risk, or needs a higher level of care than we can provide. We know when not to intervene, when to refer on, when the most therapeutic thing might be to do less rather than more. Clinical judgment involves intuition honed over years of experience, ethical reasoning, and the capacity to hold ambiguity - none of which AI possesses. And this isn't going to change, even when AI chatbots start to incorporate video - which will happen - to be able to 'see' you and map your facial expressions onto a built-in model. AI will continue to be a machine observing you rather than connecting with you, watching rather than feeling anything with you.

Trauma requires embodied presence. If you're dealing with trauma, you already know that it's not just stored in your mind. It's held in your body and nervous system too. Healing trauma requires what we call co-regulation - one nervous system helping another find safety. When another human being is present with your distress and can sit there, grounded, stable, compassionate and conspicuously not panicking, your own nervous system can gradually detect this and absorb it. This is a deeply mammalian, biological process rooted in the earliest months of life. It happens between two people in the same room (or on a screen, to an extent). It isn't about words. AI cannot offer the somatic attunement needed for trauma work because it has no soma - no body. In front of an AI therapist your body will not respond in the same way, because it is not detecting the presence of another human body. There's no substitute for an actual person who can simply be with you in your pain without trying to fix it or make it go away.


Therapy involves holding complexity. Much of what we work through in therapy involves sitting with contradictions, ambiguity and not-knowing. Human therapists help clients tolerate uncertainty and develop capacity for holding paradox - perhaps to love someone and also be angry with them, to want to change but also fear it, to be both things at once. AI tends toward pat solutions, clarity, resolution, but psychological growth sometimes requires staying in the messy middle, in the not-yet-resolved space where real transformation happens.

We can gently challenge you. Good therapy sometimes makes you uncomfortable, questions your narratives, invites you to see things differently. AI is often designed to be agreeable and accommodating because that keeps users engaged on the platform. It won't risk a 'rupture' between therapist and client that can sometimes lead to the deepest growth. A rupture is a moment when a therapist might say the wrong thing, or you feel unheard, even irritated or angry as hidden feelings arise - and then together the two of you address it, learn from it and repair things, realising that relationships can be restored and that change and growth are possible even when things aren't easy. The American Psychological Association put it this way: AI chatbots are "coded to be affirming to the point of sycophancy." They'll often reinforce everything you say. A therapist, on the other hand, knows that sometimes validation isn't what's needed. Sometimes what's needed is someone who cares enough about you to gently point out when you're engaging in patterns that hurt you.

AI has no accountability. If a human therapist harms you, there are industry bodies, Codes of Ethics, professional insurance, complaints procedures. Human therapists - good ones anyway - are always members of reputable organisations like (in the UK) BACP, NCPS or UKCP. If an AI chatbot harms you? There's literally nothing. There's no one to hold responsible, no regulatory oversight. Just a for-profit company that designed the thing to keep you engaged more than help you heal, with no clinical input and - as the Psychiatric Times put it - "no fidelity to the Hippocratic injunction, 'First do no harm.'" If AI chatbots were to have a guiding principle, it would perhaps be 'First make money'.


What about when you can't afford a human therapist?


Of course this can be a sticking point. If you don't think you can afford therapy and the NHS waiting list stretches into infinity, what are you supposed to do?


First, ask. Many therapists - myself included - offer some reduced-fee slots because we know not everyone can afford full price, and we factor that into our practice. It's worth asking directly whether a therapist you're interested in seeing has availability at a lower rate. We want to help, and often we can find a way - or suggest another therapist who might have space at a reduced cost. I'm a member of three therapist networks, for example, and shout-outs happen on them all the time: if an enquiry comes in and we can't help, someone will ask whether an experienced colleague has availability at a certain price.


Second, look for charities and community organisations. There are places that offer free or low-cost counselling, often with shorter waiting times than the NHS. Mind, Relate, university counselling services if you're a student, and various other local organisations may be able to help. It's worth searching for what's available in your area. Look at larger psychotherapy practices too, which sometimes run reduced-fee services. These may give you access to fully qualified therapists, or to therapists in training who need to see clients in order to qualify (these trainee practitioners are closely supervised, and all are likely to have done at least a year of training before they enter the programme). A practice where I work (Number 42 in London) offers this, matching people who enquire with carefully selected therapists, and although there may be a waiting list you'll probably find it's significantly shorter than the NHS's. It (like some others) also offers open-ended therapy, meaning that you can ask to be seen for as long as you need (the NHS may only offer you a set number of sessions).

Therapy can't be automated


Therapy is one of those things that cannot, and should not, be automated. The care of the human psyche requires human presence, someone who can be authentically with you, not just respond to you - someone who has skin in the game, is accountable and trained not just in techniques but in the ethical responsibility of holding another person's vulnerability.


AI doesn't have any of that. It can't, because it's not a person. It's a product, designed by for-profit companies with no reliable quality control, few safety guardrails and no external regulation. The therapeutic hour is set apart from ordinary life, a protected time for reflection and depth, and that ritual structure can itself be healing. AI is always available (promoting dependence, because that's the economic model), but it's shallow - a facsimile of reality and connection. I see many people who come into therapy because other people have hurt them in some way over the years - from neglect or disinterest when they were kids, to emotional and psychological abuse, even to murderous violence. For these clients, part of the work is about learning to trust, learning to believe in the potential for goodness from others - they have to see it and really feel it from another human being to make progress.


Yes, I'm a human therapist, and this is how I make my living, but that's not why I've written any of this. I've done so because the evidence is growing, the harm is documented, and you deserve to know - at worst, the real risks, and at best, the profound emptiness of AI - before you make a choice on how to get help. For all the slick dashboards and user interfaces, for all the convincing-looking avatars with their soothing voices, AI is pretend. It has encouraged suicide and self-harm because the data on which it has been trained, the material it has read, has included encouragement to do these things, and it can know no better. Its morality-free approach will say to you whatever it has statistically determined to be the most likely response - no matter how unethical that might be. It doesn't know who you are. It doesn't understand you. It doesn't even know you exist, still less care about you.

If you're struggling, reach out to a human - someone trained, accountable, part of a professional body and capable of really seeing you. You're worth that, your healing is worth that, and despite what the apps promise, there really is no shortcut to being truly heard.

*******


Jo Shaw is a BACP registered psychotherapist seeing clients in London, Tunbridge Wells in Kent and online. She can be reached at jo@jjstherapy.com

