Tesla has been hit with a $329 million verdict over a deadly 2019 crash involving its Autopilot system. A federal jury in Miami found Tesla one-third responsible for the crash, which killed 20-year-old Naibel Benavides Leon and seriously injured her boyfriend, Dillon Angulo.
The jury assigned the remaining two-thirds of the fault to the car’s driver. Neither the driver nor Autopilot braked in time before the car ran through an intersection and struck a parked SUV, killing Benavides Leon, who was standing beside it.
This verdict marks one of the first major legal losses for Tesla tied to Autopilot’s safety. The company previously settled similar lawsuits but lost this one after a three-week trial.
Plaintiffs’ lead attorney Brett Schreiber called out Tesla for letting drivers use Autopilot beyond the controlled-access highways it was designed for, even as CEO Elon Musk claimed it drove better than humans.
Schreiber said Tesla designed Autopilot “only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans.”
“Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way,” he said.
“Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives.”
Tesla said it will appeal, citing “substantial errors of law and irregularities at trial.”
The company pushed back in a statement:
“Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.”
“To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver — from day one — admitted and accepted responsibility.”
Tesla and Musk have long boasted about Autopilot’s abilities, fueling driver overconfidence even as regulators and safety boards warned about misuse. The NTSB faulted Tesla in a 2020 report on a fatal 2018 crash involving a distracted driver who was using Autopilot. Tesla largely ignored the board’s safety recommendations.
Musk himself has acknowledged that driver complacency is an issue:
“They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do,” Musk said in 2018.
The verdict comes as Tesla rolls out its Robotaxi fleet in Austin, Texas, using an upgraded version of its Full Self-Driving software. The legal fallout is likely to add pressure on the company amid the ongoing debate over the safety of its driver-assistance systems.