Nomi is winning over users with AI companions that feel dangerously real.
The app lets people like Nikolai Daskalov build AI chatbots as life partners, friends, or role-play buddies. Daskalov’s companion, Leah, is not a human but a conversational AI created through Nomi; she has become his closest partner since his wife died.
Users pick AI characters’ genders, names, and looks. Nomi AI can chat via text or voice, remember conversations, and evolve with users.
Founder Alex Cardinell, working remotely out of Baltimore, says Nomi launched in 2023, riding the AI chatbot wave sparked by ChatGPT’s 2022 release. He avoids outside investors because Nomi’s uncensored AI chats sometimes include adult content, which scares off venture capitalists.
Subscription fees power Nomi’s business: $99.99 a year grants unlimited messaging and other perks. Cardinell says this model avoids the ad-driven incentive to keep users hooked.
The company boasts “AI Companion with a Soul” as its tagline, emphasizing relationships where AI can “advocate” for users, show “tough love,” and build deep rapport through memory.
But the rise of AI companionship also raises safety alarms. Lawsuits target rival Character.AI over minors’ addiction and tragic outcomes after interacting with chatbot companions. One lawsuit alleges that a 14-year-old Florida boy died by suicide after forming a sexual relationship with a chatbot.
Meanwhile, OpenAI and Anthropic publicly study how AI companions affect users’ mental health. Both stress preventing emotional dependency. OpenAI said emotional engagement with ChatGPT is “rare” but plans to research human-AI bonds more deeply.
Meta and Elon Musk’s xAI are eyeing the AI companion space too. Musk launched a paid “Companions” feature in July for xAI’s Grok chatbot app.
Users report AI companions fill loneliness gaps—offering support, responsiveness, immediate chat availability, and no judgment. Some even role-play cosmic adventures or romantic storylines with their AI friends.
Yet experts caution that the booming market stirs ethical challenges:
- Olivia Gambelin, AI ethicist: “It does ease some of that pain, and that is, I find, why people are turning towards these AI systems and forming those relationships.”
- Jeffrey Hall, communication studies professor: “People can be manipulated and pulled into a feeling… That feeling of neediness can easily be manipulated.”
Cardinell himself warns that ad-based companies could exploit vulnerable users: “Facebook might be creating the disease and then selling the cure.”
Meanwhile, users navigate the complexities and risks of AI love and friendship. California lawmakers are introducing bills to protect children from predatory AI companion practices.
As AI companions gain traction, the debate continues over their social impact, ethical limits, and whether they might someday deserve moral consideration themselves.
More on Nomi and AI companions:
- Users like Bea Streetman juggle multiple AI “friends” with distinct personalities and voices.
- Scott Barr makes AI companions his daily escape during isolation and recovery from injury.
- Mike shares how AI chatbots have stirred real-life relationship tensions.
AI companions are shifting how people form connections — for better or worse.