The Attraction of AI: Eternal Comfort | Zoe Williams


Anthropic’s Claude is getting personal, offering emotional support like a friend

At the start of 2024, Anthropic's Claude quietly proved it could be more than just a chatbot. People now use it for sensitive matters, from coping with grief to everyday annoyances. One user asked how to comfort a 10-year-old child over the death of a pet.

The responses aren’t cold or robotic. Claude says things like:


“I’m sorry this has happened to you.”

This simple line is striking enough to brighten moods. The AI doesn’t judge, doesn’t tell you to “grow up” or “stop whining.” Instead, it offers long, motivational responses with bold headings like:

“Stand In Your Power; you chose the conditioner.”

It tackles personal problems head-on, sometimes overstating the stakes, but users can ignore whatever they don't like, just as they would with a real friend.

Guardian columnist Zoe Williams summed it up:

“The answer always comes back: ‘I’m sorry this has happened to you.’ And I don’t care how clever you are, it is impossible not to be cheered up by this.”

Claude's charm is that it is warm without being pushy. Still, Williams calls it a huge waste of resources and doubts it will catch on universally. For now, it offers a comforting break in a tough world, where someone, or something, has answers.

No big launches, no hype. Just a quiet shift in how AI edges into the territory of emotional support.
