Lebanon’s mental health crisis is pushing people to ChatGPT for help, sometimes with harmful results.
Years of war, economic collapse, and the latest Israeli bombardment have left tens of thousands displaced and psychologically shattered. Therapy is expensive and scarce, so many people, including 34-year-old mother Zainab Dhaher, have turned to AI chatbots for support.
The problem started when Zainab tried ChatGPT “self-tests” she found on Facebook. The bot suggested severe conditions such as PTSD, schizophrenia, and ADHD.
“We’re observing a growing trend, especially among younger people, of turning to AI tools for emotional support,” said Dr. Randa Baraja, a clinical psychologist at CPRM Clinic, who says such cases are becoming common.
She added a warning:
“ChatGPT doesn’t offer genuine emotional attunement. It cannot replicate the human connection necessary for healing. More dangerously, it can delay access to professional help.”
Zainab shared her experience:
“It shook me… I couldn’t afford therapy. I work at a beauty salon and earn $400 a month. Rent alone is $1,200. Therapy isn’t an option for people like me.
At first, ChatGPT seemed like an outlet… but its responses felt hollow. I was getting angrier after every conversation. It felt like shouting into a void.”
The reliance on AI for mental health support extends well beyond Zainab. Young Lebanese like fashion entrepreneur Sarah Rammal have also leaned on ChatGPT to cope with trauma and loss. It helped at first but was not a long-term fix.
Hotline responders are seeing a spike in calls from youth overwhelmed by war trauma and economic despair. Many mention turning to AI because it costs nothing and is always available.
Mental health NGOs are pushing free, clinical-grade apps like Step-by-Step instead. Still, the psychological toll of conflict and poverty runs deep.
Lebanon’s war and economic crisis may be paused, but the mental scars are deepening, and AI is becoming a risky emotional crutch for millions stuck in limbo.
Zainab put it bluntly:
“We left the war. But the war didn’t leave us.”