Atai Life Sciences founder Christian Angermayer says AI could support psychedelic therapy with motivational check-ins between sessions. He stresses AI should only augment, not replace, human therapists during trips.
The idea: AI could back the lifestyle changes patients take on voluntarily between sessions, while guiding patients directly through trips would still require trained professionals ready to intervene.
Meanwhile, AI mental wellness app Alterd lets users “chat with your mind,” a custom bot reflecting personal thoughts and moods.
Sam Suchin, creator of Alterd, explained the AI isn’t just ChatGPT: it analyzes mood, journal entries, and emotional tone for tailored insights. Its goal is to nudge users away from harmful behaviors like substance overuse.
One user, Trey, says Alterd has helped him stay off alcohol.
"This app and everything else is giving me deep self-awareness," Trey said.
"I have become able to observe my thoughts, feelings, and impulses without judgement or spiraling."
But experts warn of significant risks in handing AI tools a role in such sensitive therapeutic settings.
UC San Francisco neuroscientist Manesh Girn flagged AI's inability to provide emotional attunement or help regulate a patient's nervous system.
"These are both central to therapeutic rapport, which research indicates is essential to positive outcomes with psychedelic therapy."
"Exclusively relying on a disembodied and potentially tone-deaf agent, rather than an attuned human presence, has a high potential for harm."
Online forums already carry reports of ChatGPT triggering psychosis, even without psychedelics involved.
OpenAI emphasizes that its chatbot is not a substitute for professional care.
"Its models are taught to remind users of the importance of real-world human connection and professional guidance," a spokesperson said.
"Usage policies require users to comply with the law and not cause harm to themselves or others."