What Does the Tesla Settlement Mean for Future Autonomous Vehicle Claims?

Reviewed by Louis Patino, JD, DC

Louis Patino, JD, DC
A former U.S. Army Combat Medic, Dr. Louis Patino is a distinguished attorney recognized by Top Attorneys of America, Expertise, and the American Institute of Trial Lawyers. He has a Doctor of Jurisprudence from Texas Southern University and a Doctor of Chiropractic from Parker College of Chiropractic.

On March 23, 2018, Apple engineer Walter Huang died when his Tesla Model X, traveling at 71 mph, crashed into a concrete barrier on U.S. Highway 101 in Mountain View, California.

A wrongful death lawsuit filed by Huang’s family alleged that the vehicle’s Autopilot feature failed to activate automatic emergency braking, citing a design defect in the driver assistance system.

But after nearly five years of litigation — and just days before opening statements were slated to begin — Tesla settled the case.

Here, San Antonio and McAllen personal injury lawyer Dr. Louis Patino explores the case, highlights why Tesla would settle the claim at the eleventh hour, and speculates what this might mean more broadly for autonomous vehicle claims.

What Happened?

38-year-old Walter Huang was killed on his way to work when his Model X SUV swerved off a California highway into a concrete barrier at 71 miles per hour.

The ensuing wrongful death lawsuit claimed that Tesla did not train the Autopilot feature — which was active — to detect barriers where one highway merges with another.

Tesla vehemently denied liability, arguing that Huang caused the crash through “highly extraordinary misuse.”

Tesla alleged that Huang over-relied on the Autopilot feature and was playing a video game on his phone while driving. The company stated that Autopilot enables its cars to steer, accelerate, and brake automatically, but that the driver assistance system does not make them autonomous and requires "active driver supervision." Tesla's court filings showed that Huang's hands were detected on the steering wheel for only 66% of the 19 minutes before the crash (approximately 13 minutes) and were not detected at all in the six seconds leading up to the fatal accident.

Tesla further contended that Huang knew the Autopilot feature did not make his vehicle autonomous and chose to play a video game in rush-hour traffic anyway. Huang's surviving family countered in their lawsuit that the engineer believed his Model X was safer than a human-operated vehicle. Notably, Tesla CEO Elon Musk had claimed as far back as 2016 that Autopilot was "probably better" than human drivers.

Tesla's case revolved around this issue of cell phone use: because Huang was not paying full attention to the road, he was supposedly unable to take evasive action and manually override the driver assistance system to avoid the crash.

But a second named party in the lawsuit provided an alternative route of recovering compensation for the plaintiff — and a potential scapegoat for Tesla.

The lawsuit accused the California Department of Transportation (Caltrans) of contributing to Huang’s death by failing to maintain a crash attenuator at the site of the accident. This roadside safety device dissipates the energy created upon impact and stops motorists from hitting roadside hazards such as the concrete divider Huang’s Tesla hit in 2018. Unfortunately, the crash attenuator was not operational — a vehicle had struck it almost two weeks before Huang’s accident, and Caltrans left it unrepaired. This oversight meant that the device did little to reduce the impact that caused Huang’s fatal injuries.

How Strong Was the Plaintiff’s Case?

It takes significant resources to take on Tesla in court. After five years of discovery (evidence gathering), motions, and continuances, with Tesla never budging from its stance (until the private settlement, at least), would the plaintiffs' contention have been enough to win at trial?

Tesla’s previous record would have reasonably cast doubt on a favorable outcome.

Huang's tragic accident was not a one-off. In August 2021, three years after Huang's crash, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into the vehicle manufacturer over concerns about its Autopilot feature. Nearly three years later, on April 26, 2024, the safety agency identified at least 13 fatal Tesla crashes involving Autopilot, and many more causing serious injuries.

Several of these accidents led to civil lawsuits filed against Tesla.

In 2019, Micah Lee was operating his Tesla Model 3 with Autopilot engaged when it suddenly veered off a highway at 65 mph, struck a palm tree, and burst into flames.

The crash killed Lee and severely injured two passengers, including an eight-year-old boy. Lee’s family and the injured passengers sued Tesla, alleging the company knew its safety system was defective.

Tesla disputed whether Autopilot was engaged during the crash and blamed the driver for being intoxicated.

After a grueling trial — in which the plaintiffs asked for $400 million in compensatory damages plus punitive damages — the jury returned a verdict in favor of Tesla.

This success came after Tesla won a personal injury trial against motorist Justine Hsu, who claimed the Autopilot in her Tesla Model S swerved the vehicle into a curb, causing the airbag to deploy so violently that it fractured her jaw, knocked out several teeth, and caused nerve damage. Jurors determined that Autopilot (and, by extension, Tesla) was not responsible and that its vehicles were not fully self-driving, despite the plaintiff's argument that motorists might assume so from the feature's name.

With Tesla’s track record of success in court, the chance of Huang’s family succeeding in holding Tesla accountable for their loved one’s death seemed slim. And, with Elon Musk’s adamance that “[Tesla] will never surrender/settle an unjust case against us, even if we will probably lose,” a settlement seemed highly unlikely.

It's common for personal injury lawsuits to settle weeks or even days before trial. Sometimes, initial negotiations fail but mediation results in an agreement. Other times, a defendant might not want to risk a jury finding in favor of the plaintiff and awarding substantial damages (a $10 million private settlement is preferable to a $50 million judgment in court), especially once all the evidence of the plaintiff's case is "on the table."

Why, then, did Tesla settle the case just days before jury selection was scheduled, especially given the precedent of successful cases in Tesla’s favor and billions of dollars at its disposal for its defense?

We do not know the details of the settlement, and we can only speculate how the trial would have proceeded, but multiple arguments would have put the plaintiff in a solid position to argue that Tesla’s failure to mitigate the risk of misuse amounted to negligence.

First is the outcome of the NHTSA's investigation. In its findings, the safety agency determined that Autopilot's driver monitoring system was defective and did not effectively ensure drivers were paying sufficient attention to the road with Autopilot engaged. The agency further found the Autopilot system, including its name, could give drivers a false sense of security.

Further supporting the case was a notice issued in December 2023, when Tesla was forced to recall more than two million vehicles equipped with Autopilot in North America. The fix involved issuing a software update to increase so-called “Autopilot nag” — repeated warnings to drivers to keep their hands on the steering wheel.

These issues raise a broader question that likely would have been posed during the trial: Could Tesla have reasonably foreseen that motorists would misuse the Autopilot system?

In an email sent just weeks before Huang’s accident, Tesla President Jon McNeill reported he was “immersed” in reading emails and taking calls with the Autopilot engaged.

Together, these circumstances start to form a central argument: Tesla claimed in its response to the lawsuit that motorists should avoid distraction and keep their hands on the wheel even when using the Autopilot feature because Tesla cars are not autonomous. However, if the Tesla President relied on the feature to safely steer his vehicle while his eyes were not on the road, it stands to reason other drivers might assume the same. Indeed, the NHTSA’s investigation raised this same concern, and Tesla’s actions to increase the frequency of warnings to drivers indicate the company could reasonably foresee drivers might overly rely on Autopilot.

What Does the Tesla Settlement Mean for the Future of Autonomous Driving?

In Tesla's motion to the court disclosing that a confidential settlement had been reached, the company continued to deny liability and doubled down on its argument that Mr. Huang's actions were the sole cause of the crash and that his fatal injuries were a result of Caltrans' negligence in failing to repair the previously damaged attenuator. Neither party has disclosed how much the case settled for (and we may never know), but Tesla did state that the amount was "within the ballpark" of its potential liability.

Tesla further stated its motivation for settling the case was to “buy peace” and ensure the plaintiffs would be taken care of. It is an admirable sentiment given the unbearable heartache suffered by Huang’s surviving family — and arguably uncharacteristic of a company that has aggressively fought against personal injury lawsuits filed by other victims.

We can only take Tesla at its word. However, a skeptic might argue there was more to it: the manufacturer may have opted to negotiate a settlement because it believed there was a reasonable chance a jury would find it at least partly liable, or to save face and avoid the scrutiny, reputational damage, and hit to consumer sentiment of what would have been a highly publicized trial, especially in the run-up to Tesla's planned Robotaxi launch in August 2024.

However, we should not rush to assume that this outcome will make it easier for other plaintiffs who have been injured, or who have lost loved ones, in Tesla Autopilot-related auto accidents to recover compensation.

If anything, such an outcome will make Tesla double down on its design, manufacturing, and marketing standards to prevent further litigation.

It comes down to one central truth: any motor vehicle capable of traveling at 100-plus miles per hour is an inherently dangerous product. Merely warning of the dangers of not staying engaged while operating these cars is not enough: Tesla must be certain its programming is not defective.

The impact of this settlement in the court of public opinion remains to be seen. It may make some consumers wary of driver assistance technology.

Then again, the media never reports on the many successful trips motorists take using driver assistance systems like Tesla's Autopilot; we only ever hear when something goes catastrophically wrong.

How Common Are Autonomous Driving Crashes in Texas?

Based on data from the Texas Department of Transportation’s Crash Records Information System (C.R.I.S.), there were 1,468 reportable crashes involving autonomous vehicles in 2023 (from April 2023 onward, when this data became available).

Most resulted in no injuries to any party involved:

Crash Severity | Number of Crashes
Unknown | 21
Not Injured | 997
Possible Injury | 248
Suspected Minor Injury | 163
Suspected Serious Injury | 31
Fatal Injury | 8
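
As a quick sanity check, the severity counts above sum to the 1,468 reportable crashes cited, and the four injury categories account for 450 of them. A back-of-the-envelope sketch (values transcribed from the table; this is an illustration, not an official C.R.I.S. query):

```python
# Crash counts by severity, transcribed from the Texas C.R.I.S. table above
# (April 2023 onward, when autonomous-vehicle data became available).
crashes_by_severity = {
    "Unknown": 21,
    "Not Injured": 997,
    "Possible Injury": 248,
    "Suspected Minor Injury": 163,
    "Suspected Serious Injury": 31,
    "Fatal Injury": 8,
}

# All reportable autonomous-vehicle crashes in the dataset.
total_crashes = sum(crashes_by_severity.values())

# Crashes with a suspected or confirmed injury: every category except
# "Unknown" and "Not Injured".
injury_categories = (
    "Possible Injury",
    "Suspected Minor Injury",
    "Suspected Serious Injury",
    "Fatal Injury",
)
injury_crashes = sum(crashes_by_severity[c] for c in injury_categories)

print(total_crashes)   # 1468
print(injury_crashes)  # 450
```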

Across the 450 crashes causing suspected or confirmed injuries (possible, suspected minor, suspected serious, and fatal), nine people suffered fatal injuries:

Person Injury Severity | Number of People
Unknown | 36
Not Injured | 682
Possible Injury | 430
Suspected Minor Injury | 230
Suspected Serious Injury | 42
Fatal Injury | 9

In these crashes, 871 individuals were in an “Autonomous Unit” (an interesting use of terminology given Tesla’s stance on its vehicles not being autonomous):

Driver Type | Number of Individuals
Driver | 560
Passenger/Occupant | 308
Driver of Motorcycle-Type Vehicle | 3

By excluding passengers, we can determine the most common makes of "autonomous" vehicles involved, based on how many were driven by motorists.

Tesla took the lead, followed by Ford, Toyota, Chevrolet, and Nissan.

Vehicle Make (Top 10) | Number of Vehicles
Tesla | 94
Ford | 67
Toyota | 65
Chevrolet | 58
Nissan | 47
Honda | 29
Jeep | 19
Kia | 19
Dodge | 18
Hyundai | 17

Finally, of the 94 Teslas involved, 47 were cited as having the "Driver Assistance" autonomous level engaged during the crash. Only two of these crashes caused suspected serious injuries (to five drivers or passengers), and nobody died in a Texas Tesla accident with the driver assistance system active in 2023.

These numbers will inevitably increase in the years to come. Right now, autonomous vehicles represent a tiny portion of all vehicles on Texas roads, adopted by the few with the disposable income to purchase the vehicle and the faith in the relatively new technology to navigate them safely to their destinations.

Time will tell how much of an impact vehicles using driver-assisted systems will have on the safety of Texas roads and across the country as the technology becomes ever more sophisticated and affordable to the masses. However, outcomes like the Tesla settlement and the stark reminder of the lives lost may make motorists more wary about using it.

If you or a loved one have been injured in an accident involving an autonomous vehicle, or in any other auto accident, book a free, no-obligation case review with our car accident lawyer in San Antonio and McAllen. You could recover substantial compensation for your injuries and losses.
