AllVoices exec's test of AI interview tool sparks backlash over robotic, tone-deaf experience
Emily Fenech, a marketing VP at AllVoices, tested an AI interview tool circulating in HR circles, and it didn't go well. The voice was robotic and cold, with no real emotional intelligence. Instead of a helpful screener, it came across as an awkward, gaslighting bot.
She ran a mock interview for an office manager role she was plainly unqualified for, and the AI kept praising her sarcastic answers with hollow enthusiasm.
Fenech called it out:
"I was just staring into this blankness with a robotic voice asking me high-stakes questions, with my future livelihood on the line."
When Fenech mentioned 25 years of experience, the AI responded with gushing admiration. And when she joked about "planning birthday parties and ordering toilet paper," it still insisted she was impressive.
"Even though I was using sarcasm and making jokes because I was not qualified for the role, this robot kept telling me how impressive I was."
The experience left Fenech feeling disconnected and unwilling to engage. The tool doesn't read human cues or sarcasm, she says, and she fears it could harm real hiring decisions.
Her take on AI in HR?
"AI is really good for structuring unstructured data, remembering things, and taking notes. But any conversation that requires emotional intelligence, please don’t use AI."
She endorses AI for simpler HR tasks like transcribing calls or tracking goals—just not interviewing.
"When it’s deciding from a pool of candidates who proceeds and who doesn’t, it’s obvious to see the potential for harm."
The post went viral on LinkedIn, with commenters confirming that their companies use this or similar tools.
AI interview tools now face growing scrutiny as users demand better emotional understanding from the technology.