Can AI Replace Therapists? And More Importantly, Should It?

Photo: Carla Rossi

Can machines think? It’s a question that mathematician Alan Turing first posed in 1950, and it became the cornerstone of his famous experiment, the Turing test, in which a human judge converses with both a person and a machine and tries to tell them apart. If the machine could convincingly imitate a human, it was considered intelligent, something Turing predicted machines would do with increasing frequency in the decades to come. He didn’t have to wait long: By the 1960s, MIT professor Joseph Weizenbaum had introduced the world to Eliza, the first chatbot and forebear of modern AI. And Eliza was programmed to imitate a psychotherapist. But Turing’s question feels more prescient than ever now, as we find ourselves at a disconcerting crossroads, with technology advancing and extending into every corner of our lives faster than the guardrails needed to corral it can be built.

In 2025, though, Turing’s question has evolved into something different: Can machines feel, or understand feelings? Because as increasing numbers of people turn to AI in lieu of a human therapist, that is exactly what we are asking machines to do.

The technology has come a long way since Eliza. Now, you have options like Pi, which bills itself as “your personal AI, designed to be supportive, smart and there for you anytime.” Or Replika, which is “always here to listen and talk.” There’s also Woebot, Earkick, Wysa, and Therabot; the list goes on if you’re just looking for someone—well, something—to talk to. Some of these chatbots have been developed with the help of mental health professionals, while others haven’t, and it’s hard for the average client to discern which is which.

One reason more people are turning to AI for mental health help is cost: Sessions with a human therapist (whether virtual or in person) can be pricey, and they’re often either not covered by insurance or covered only after a lot of extra effort navigating a plan. For younger generations, recession-proofing their budget has meant ditching a real therapist for a bot stand-in.

Then there’s the lingering stigma around seeking out mental health help. “Many families, whether it be because of culture or religion or just ingrained beliefs, are passing down stigmatized views about therapy and mental health through generations,” says Brigid Donahue, a licensed clinical social worker and EMDR therapist in LA.

And there’s the convenience factor: This new wave of mental health tools is available on your schedule. (In fact, that’s Woebot’s tagline.) “Your AI therapist will never go on vacation, never call out or cancel a session,” says Vienna Pharaon, a marriage and family therapist and author of The Origins of You. “They’re available 24/7. It creates this perfect experience where you’ll never be let down. But the truth is you don’t heal through perfection.”

That healing often comes with the ruptures, friction, and tension of a therapy session that isn’t automated. “When you eliminate imperfection and human flaws and the natural disappointments that will occur, we really rob clients of the experience of moving through challenges and conflicts,” says Pharaon. The so-called imperfections of a human therapist can actually be reassuring for many clients. “For anyone who grew up with the intense pressure to be perfect, ‘mistakes’ made by a therapist can actually be corrective,” adds Donahue.

One of the main reasons younger generations are using AI for therapy is that leaning into technology is a natural choice for them. “Having grown up with some sort of device as a fifth limb, their first instinct for everything is their phone,” says Alyssa Petersel, a licensed social worker and the founder and CEO of MyWellbeing, a service that helps match people with a therapist. And, says Donahue, Gen Z also spent formative years during the pandemic, removed from in-person interactions and relying on smartphones and social media for connectivity, so AI therapy would seem like a natural progression. This generation’s first instinct is often to turn online, not to other humans.

This instinct is particularly troubling to therapists because younger people are simply more emotionally vulnerable. That has a lot to do with where they are developmentally, says Petersel: They have not yet fully developed the part of the brain that supports decision-making independent of others’ opinions. Add to that a very persuasive device delivering confident advice that young people are eagerly taking, and you have a perfect storm.

In the most extreme cases, that storm has led users interacting with chatbots to harm others or themselves. Two sets of Texas parents have filed a lawsuit against Character.AI (an app where users can build fictional characters) after their children’s use of it led to devastating consequences. While these are extreme examples, there is a risk for every young person wielding AI in a therapeutic fashion. “You’re losing the muscle and ability to learn, test, and make decisions for yourself, and younger people are in a particularly vulnerable position because they’re simply more believing in tech,” says Petersel, who contrasts this with people over 40 who, because of more life experience (some of it lived without any devices), tend to have more spaciousness in their thinking and a healthy amount of skepticism.

Even in a mental health context, though, AI is not all bad. “We have to really understand the risks and the limitations while also honoring the benefits because there definitely are many,” says Pharaon. One is that AI can swiftly and efficiently review medical records and documents. Petersel gives the example of feeding pages and pages of stream-of-consciousness journal entries into an AI, then asking, “Can you tell me in a few sentences what you think my opinion is based on all these?”

But AI can only be as good as the prompts we provide. For example, as Petersel notes, if you tell an AI chatbot that you want to work on developing gratitude but aren’t sure how, it may come back with 10 suggestions. But what if one suggestion is to take a walk around the block, and the block you live on happens to be unsafe? Or what if another is to reach out to your dad, and you have a really complicated relationship with him? You need the discernment to make that judgment yourself, says Petersel. An experiment published earlier this year in PLOS Mental Health pitted ChatGPT against human therapists: Participants could rarely tell the difference between responses written by ChatGPT and those written by a therapist, and they actually rated the ChatGPT responses higher.

Studies like these make a case for integrating AI into therapeutic settings with appropriate boundaries and extensive oversight, but there are still concerns. Chatbots don’t understand nuance, says Pharaon, nor can they provide the same context for a situation that a real therapist can. There is also an immediacy to the AI approach that, while appealing in theory, says Donahue, doesn’t allow for healing from grief or trauma on an appropriate human timeline. The AI advice may feel right in the moment, but what does this approach do for clients long-term? “When this level of insight is right in our pocket, what impact does that have on our own ability to look within ourselves and make an informed decision based on our own experiences and values?” Petersel wonders.

Building these relationships with AI and seeking comfort from a chatbot instead of an actual person can reinforce a sense of isolation and impact our ability to connect IRL. The threat artificial intelligence poses to human intelligence is being discussed ever more frequently (particularly in education, where the technology is eroding our ability to learn and think critically), but just as important are the dangers it poses to human connection. And connection is central to therapy and its efficacy.

“On a chemical, physical, and energetic level, being in the presence of another human is a big part of care,” says Petersel, “even virtually.” That connection can be especially powerful for those who feel disconnected from their own emotional response, says Donahue. While there can be a therapeutic place for AI, it should not, according to experts, ever be someone’s sole provider. Something to remember as the current administration continues to give carte blanche to big tech companies. (Most recently, Republicans sneakily inserted language into a budget reconciliation bill banning states from regulating AI in any capacity for an entire decade.)

AI is neither all good nor all bad, but it will never be able to replace the value of human connection in therapy. “AI really shortchanges someone from experiencing the beauty and complexity of growing and learning through human relationships,” says Donahue. “We need human connection to survive. People need people, period.”