Steve Kherkher - August 11, 2025
A Miami federal jury recently found that Tesla bore partial responsibility for a deadly 2019 crash involving its Autopilot driver-assist system and ordered the company to pay more than $240 million in damages to the victims and their families. The verdict – one of the largest against Tesla tied to its advanced driver assistance software – raises immediate questions about corporate accountability, the safety of semi-autonomous driving systems, and how victims can pursue justice when complex technology is involved.
At Kherkher Garcia, our Houston self-driving accident attorneys are prepared to help victims of accidents involving new automotive technology. If you have been injured, or have lost a loved one, in a crash involving an autopilot or driverless vehicle, we want to help. Read below to learn more about these accidents and how victims can protect their rights.
According to media reports, the crash occurred in 2019 when a Tesla Model S operating with Enhanced Autopilot struck pedestrians, killing a 22-year-old woman and seriously injuring another person. Plaintiffs’ lawyers argued that the vehicle’s Autopilot features were overpromised and unsafe for handling certain road scenarios, particularly intersections and cross-traffic situations. Tesla’s defense emphasized driver error and speeding. The jury ultimately apportioned liability to Tesla and returned a verdict that included both compensatory and substantial punitive damages.
Tesla has said it intends to appeal the verdict. The ruling comes as federal regulators and safety agencies continue to scrutinize Tesla’s Autopilot and similar advanced driver assistance systems.
Verdicts like this one matter for several reasons – not only for the victims of the particular accident, but for society at large. A few of the most notable reasons this Tesla verdict matters:
The jury’s decision rested in part on the idea that Tesla’s promotion and continued rollout of Autopilot/FSD features could create unreasonable expectations about the system’s abilities. When manufacturers market driver-assist tech in ways that suggest near-autonomy but the systems have real limitations, that gap can form the basis of product liability, negligence, and punitive damages claims.
Plaintiffs in the case alleged problems with Tesla’s handling of evidence and warnings about the system’s limits. Large verdicts often reflect not only the harm suffered but also jurors’ views of a company’s conduct before and after an accident – which can amplify damages.
The National Highway Traffic Safety Administration (NHTSA) and other federal investigators have previously identified hundreds of crashes involving Tesla’s Autopilot with a subset resulting in deaths, and have criticized gaps in driver engagement safeguards. That regulatory record strengthens plaintiffs’ arguments that manufacturers understood or should have understood foreseeable misuses or system limitations.
It helps to separate marketing language from technical reality. “Autopilot,” “Enhanced Autopilot,” and “Full Self-Driving (FSD)” are brand names for Tesla’s suite of driver assistance features. This is a set of automated controls that can steer, brake, and change lanes under certain conditions but still require a human driver to supervise and intervene. These systems are typically categorized as Level 2 automation (the human driver must remain engaged), not full autonomy. When drivers misunderstand system limits – or when the system’s limitations are not adequately guarded against – accidents can and do occur.
Common failure modes in real-world crashes include:

- The system encountering road scenarios it was not designed to handle, such as intersections and cross-traffic situations
- Drivers over-relying on the technology and failing to supervise it
- Driver engagement safeguards that do not adequately detect inattention
- Marketing that leads drivers to overestimate what the system can do
Those failure modes are central to both regulatory investigations and civil lawsuits.
Autopilot-involved crashes have produced the full spectrum of motor vehicle injuries:

- Traumatic brain injuries
- Spinal cord injuries
- Broken bones and fractures
- Internal organ damage
- Burns
- In the most severe cases, death
Beyond medical costs and lost wages, victims may suffer long-term disability, diminished earning capacity, and profound emotional and family losses. These are categories that juries use to calculate compensatory awards. Where juries find egregious conduct or reckless marketing, punitive damages can dramatically increase total awards.
If you or a loved one is harmed in a crash involving driver-assist or AI driving features, take these immediate steps:

- Seek medical attention right away, even if injuries seem minor.
- Obtain the police report and gather witness contact information.
- Preserve evidence quickly, including vehicle data logs, dashcam or surveillance footage, and records of any over-the-air (OTA) software updates. Some data may be overwritten within days.
- Contact an experienced attorney promptly so evidence can be secured and filing deadlines met.
The recent verdict against Tesla is likely to encourage more cases against automakers, suppliers, and software developers when advanced driver assistance systems are involved in serious injury or death. It may prompt manufacturers to tighten design safeguards, improve driver engagement monitoring, change marketing language, or accelerate hardware/software fixes.
Regulators may also leverage high-profile verdicts to press for stricter standards or clearer consumer warnings. For victims, that means stronger legal leverage, but it also means litigation will increasingly require technical proof and expert testimony.
At Kherkher Garcia, we understand that modern vehicle cases blend traditional personal-injury law with cutting-edge technical issues. Our approach includes:

- Partnering with technical experts to analyze vehicle data and software behavior
- Moving quickly to preserve crucial evidence before it is lost or overwritten
- Investigating all potentially responsible parties, including automakers, suppliers, and software developers
- Building strong legal strategies to pursue full and fair compensation
Q: Are Tesla vehicles with Autopilot fully self-driving?

A: No. Tesla’s Autopilot, Enhanced Autopilot, and Full Self-Driving (FSD) features are considered Level 2 driver-assist systems. That means they can steer, accelerate, and brake under certain conditions, but they still require the driver’s full attention and readiness to take control at any time. There are, however, other models and services that are considered “driverless.”
Q: Is Tesla automatically liable if Autopilot was engaged during a crash?

A: Not automatically. Liability depends on the facts of the crash, including whether the system malfunctioned, whether it was marketed or represented in a misleading way, and whether the driver was using it as intended. A thorough investigation and expert analysis are needed to determine if Tesla or another party bears responsibility.
Q: What evidence is important after an Autopilot-related crash?

A: Crucial evidence can include vehicle data logs, dashcam or surveillance footage, the police report, witness statements, and any over-the-air (OTA) software update records. It’s important to preserve this evidence as quickly as possible — some data may be overwritten within days.
Q: What injuries are common in these crashes?

A: Victims may suffer traumatic brain injuries, spinal cord injuries, fractures, internal organ damage, burns, and in the most severe cases, death. These injuries often result in significant medical costs, long-term care needs, and loss of income.
Q: Can I really take on a large corporation like Tesla?

A: Yes. Large corporations can and do face product liability and wrongful death lawsuits when their products cause harm. With the right legal representation, you can hold them accountable — even if they have vast resources and experienced defense teams.
Q: How long do I have to file a claim?

A: Statutes of limitations vary by state, but in Texas (where Kherkher Garcia is based), most personal injury claims must be filed within two years of the crash. Other states may have shorter or longer deadlines. Acting quickly ensures your attorney can preserve evidence and meet all filing requirements.
Q: Why choose Kherkher Garcia?

A: Our firm has experience handling high-profile, complex product liability and auto accident cases. We partner with technical experts, secure critical evidence, and build strong legal strategies to fight for full and fair compensation. We don’t back down from corporate defendants — no matter how big they are.
Time matters in these investigations. Evidence can be overwritten, and the legal window to bring claims is limited by statutes of limitations. If you or someone you love was injured in a crash involving Autopilot, FSD, or any other automated driving feature, contact Kherkher Garcia for a free consultation.
Our team will explain your rights, preserve crucial evidence, and help you understand the realistic options for pursuing compensation. We are not afraid to take on large corporations to protect your rights and pursue the justice you deserve.
Call Kherkher Garcia to start your free consultation at 713-333-1030. To request more information or a consultation, complete our online contact form. You don’t have to face this alone.
This page has been written, edited, and reviewed by a team of legal writers following our comprehensive editorial guidelines. This page was approved by attorneys Steve Kherkher and Jesus Garcia Jr., who have more than 50 years of combined legal experience championing the rights of those who have experienced catastrophic injury due to negligence.
Connect with a Kherkher Garcia trial lawyer today to pursue maximum compensation for your injury.