Florida judge rules there is 'reasonable evidence' CEO Elon Musk and company officials knew the Autopilot system was defective but promoted it anyway
There is "reasonable evidence" that Tesla CEO Elon Musk and other managers at the automaker knew the company's cars had a defective Autopilot system, a Florida judge has ruled. In the ruling, issued last week but only published now, the judge found that Tesla nonetheless allowed its unsafe cars to be driven on the roads.
The ruling allows the complainants to file suit against Tesla for punitive damages by alleging "intentional misconduct and gross negligence" by the automaker. It is a setback for Tesla, which is facing multiple investigations from US authorities but earlier this year won similar lawsuits in California.
The Florida lawsuit concerns a 2019 crash in which a Model 3 vehicle drove itself under the trailer of an 18-wheel truck that had turned onto the road. The impact cut through the vehicle horizontally, shearing off the car's roof and killing the driver.
The judge said the Florida accident was "eerily similar" to another fatal accident, from 2016, in which a Model S fitted with Tesla's Autopilot system failed to detect a crossing truck, causing the vehicle to go underneath its trailer at high speed.
"It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the 'Autopilot' failing to detect cross traffic," the judge wrote.
Although the system in both vehicles is called Autopilot, the two cars ran different software: the Autopilot steering the Model S in 2016 was based on technology from Israeli startup Mobileye, which has since been acquired by Intel. Media reports say the accident was partly responsible for Tesla parting ways with Mobileye.
The Autopilot in the Florida vehicle ran second-generation software developed in-house by Tesla.
The judge also found evidence Tesla "engaged in a marketing strategy that painted the products as autonomous" and that public statements by CEO Elon Musk about the technology "had a significant effect on the belief about the capabilities of the products."
He cited a 2016 Tesla video showing a car driving without human intervention. A disclaimer at the start of the video says the person in the driver's seat is present only for legal reasons and that "the car is driving itself." The judge wrote in his ruling that "Absent from this video is any indication that the video is aspirational or that this technology doesn't currently exist in the market."
Earlier this year, Tesla emerged victorious in one of its first US trials over Autopilot's involvement in a fatal crash. That case involved a 2019 accident in which a Tesla veered off the road and hit a tree, fatally injuring its driver.
A California jury found there was no manufacturing defect in Autopilot. However, the jury's focus was narrow: it considered whether that particular vehicle was defective rather than the system as a whole, which limits the verdict's implications for future lawsuits.
Tesla has stated that vehicles running on Autopilot, which can cost up to $15,000 per vehicle, still require human oversight. But the Florida judge has ruled that the complainant, the wife of the deceased driver, can argue to jurors that the warnings in Tesla's manuals and agreements were inadequate.
Are the screws tightening on driverless technology? Just last month General Motors suspended operations at Cruise, its driverless car unit, after California regulators banned its vehicles from the state's roads.
The Golden State's Department of Motor Vehicles (DMV) said Cruise had "misrepresented" the technology's safety and called its driverless vehicles a risk to the public. Cruise subsequently shuttered operations across the US, where it had a presence in Phoenix, Houston, Austin, Dallas and Miami, and its CEO has resigned.