Jim Acosta sparks backlash with AI avatar interview of Parkland shooting victim

Former CNN correspondent Jim Acosta posted a video interview on Monday with an AI-generated avatar of Joaquin Oliver, a 17-year-old killed in the 2018 Parkland school shooting. The avatar, created by Oliver’s parents, used a real photo and generative AI to simulate his voice and speech.

In the conversation, the avatar answers Acosta’s questions in a stiff, monotone voice with jerky facial movements. It says gun violence cut Oliver’s life short and urges conversation about building a safer future. Oliver would have turned 25 on the day the interview was posted.


Acosta teased the “one of a kind” interview on Twitter, calling it a “show you don’t want to miss.” He described the experience as “beautiful” after speaking with Oliver’s father, Manuel, who supports using AI to preserve his son’s voice but acknowledges it cannot bring him back.

“I really felt like I was speaking with Joaquin. It’s just a beautiful thing.”
Jim Acosta

“I understand this is an AI version of my son and I can’t bring him back, but it’s a blessing to hear his voice again.”
Manuel Oliver

The interview quickly triggered pushback online. Critics argued that speaking with actual survivors would deliver a more authentic message, and one user on Bluesky slammed the avatar’s responses as “completely made-up.”

Acosta’s interview is not the first project to use AI recreations of Parkland victims. Last year, AI-generated voices of shooting victims were used to call members of Congress and demand gun reform, a campaign that included Oliver’s voice.

Concerns about the technology’s imperfections and ethics persist. Critics warn that such recreations risk misinformation, scams, and confusion over what is real. Similar AI victim avatars have appeared in court hearings, drawing mixed reactions about the technology’s role in justice and memory.
