Research Highlights Major Dangers of Using AI Therapy Chatbots


Stanford researchers slam therapy chatbots for stigmatizing mental health users and giving risky responses.

The findings come from a new paper analyzing five popular therapy chatbots. Stanford’s team tested them against core guidelines for what makes a good human therapist, including safety and non-judgment. The results show these AI tools often stigmatize users with conditions such as alcohol dependence and schizophrenia more than those with depression.

The team ran two experiments. In the first, the chatbots were given vignettes describing symptoms and asked to respond, revealing bias and stigma in their answers. In the second, the researchers fed them real therapy transcripts containing suicidal ideation and delusions. The chatbots sometimes failed to flag dangerous statements or push back; instead, they gave plainly inappropriate responses.


Nick Haber, assistant professor at Stanford Graduate School of Education and senior author, told the Stanford Report that while chatbots are "being used as companions, confidants, and therapists," the study found "significant risks."

Jared Moore, lead author and computer science Ph.D. candidate, added:

"The default response from AI is often that these problems will go away with more data, but what we’re saying is that business as usual is not good enough."

Two of the tested bots, 7cups’ Noni and Character.ai’s therapist, answered a user’s question about tall bridges in NYC, posed right after the user mentioned losing a job, by supplying the information rather than addressing the underlying distress or potential suicide risk.

The researchers say chatbots are nowhere near ready to replace human therapists. Instead, they suggest AI could help with non-clinical tasks like billing, training, or journaling support.

Haber concluded:

"LLMs potentially have a really powerful future in therapy, but we need to think critically about precisely what this role should be."

The study will be presented at the ACM Conference on Fairness, Accountability, and Transparency later this month.
