People Use AI for Companionship Far Less Than Perceived

Illustration of a robot holding a puzzle piece that fits with one held by a woman employee.

Anthropic drops new data on how people actually use Claude, its AI chatbot — and it’s mostly not for emotional support.

Only 2.9% of Claude’s 4.5 million chats are about personal advice or emotional help. Companionship and roleplay? Less than 0.5%.

The vast majority of users turn to Claude for work: content creation, coaching, and professional development. Advice on mental health and communication comes next, while true companionship chats are rare.


Longer counseling conversations sometimes shift into companionship, especially when users face loneliness or existential questions. Still, extended exchanges with 50 or more user messages are uncommon.

Anthropic noted that Claude rarely pushes back on requests, except to refuse dangerous advice or to decline supporting self-harm. Conversations also tend to grow more positive over time when people seek guidance.

“Companionship and roleplay combined comprise less than 0.5% of conversations,” the company highlighted in its report.

“We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship — despite that not being the original reason someone reached out.”

The report serves as a reminder: AI chatbots are mostly workhorses, not emotional partners. But these tools still mess up, hallucinate facts, and can even hand out harmful advice.

The full report is here.

Image Credits: Anthropic

