Tesla Model 3 Driver Sues in Death of Wife in Indiana Crash
Tesla Model 3 on Autopilot slammed into the rear of an Indiana firetruck, killing Jenna Monet.

A Tesla Model 3 crash that killed a woman was allegedly caused by the failure of the Autopilot system to prevent the car from slamming into the rear of a firetruck.

On December 29, 2019, at about 8:11 a.m., Derrick Monet was driving a 2019 Tesla Model 3, with his wife Jenna as a passenger, on a trip from Arizona to Maryland.

The Tesla Model 3 had Autopilot engaged when the vehicle slammed into the rear of a firetruck parked in the passing lane of I-70 in Putnam County, Indiana.

The firetruck had responded to an earlier accident and had its emergency lights activated when the Tesla crashed into the truck.

Jenna Monet, 23, died in the crash and her husband suffered multiple injuries, including fractures of the "lumbar vertebra, thoracic and cervical vertebrae, scapula, two ribs, and a right femur, which now has a rod implanted."

The Tesla apparently took no evasive action to avoid the firetruck, and plaintiff Mr. Monet alleges that Tesla and multiple unnamed entities involved in manufacturing the Model 3 caused the crash, injuries and damages to the plaintiff.

"Plaintiff is informed and believes, and thereupon alleges, that each of the Defendants designated herein as a DOE was and is negligently, carelessly, recklessly, unlawfully, tortuously, wantonly, wrongfully, illegally, or in some other actionable manner, responsible for the events and happenings referred to herein, and thereby negligently, carelessly, recklessly, unskillfully, unlawfully, tortuously, wantonly, wrongfully and illegally proximately caused the herein described incident and injuries and damages to the Plaintiff."

And according to the Tesla Model 3 Autopilot lawsuit:

The plaintiff "reasonably believed that the Model 3 was safer than a human operated vehicle because of Tesla's claimed technical superiority regarding the Model 3's Autopilot system and its driver assistance features and because of Tesla's claim that all of the self-driving components engineered into the vehicle would prevent injury from driving into a fixed object of any kind."

However, the Tesla website and the Model 3 owner's manual both warn owners about relying on Autopilot and related features.

"Never depend on these components to keep you safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times." — Tesla Model 3 owner's manual

"Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment." — Tesla's website

The Tesla Model 3 Autopilot lawsuit references a government investigation opened in August 2021 about Tesla vehicles with Autopilot engaged that crashed into stationary emergency vehicles.

The National Highway Traffic Safety Administration investigation includes at least 12 crashes that occurred when Tesla vehicles slammed into parked emergency vehicles that had their emergency lights activated.

The fatal Monet Tesla Model 3 crash is one of the incidents involved in the federal investigation.

In some cases, road flares, illuminated arrows and road cones had been placed around the emergency scenes, yet the Tesla drivers and the Autopilot systems failed to recognize the warnings.

NHTSA said one of the aspects of Autopilot that would be investigated is how drivers are monitored to ensure they are staying aware of their surroundings. And according to the lawsuit, Tesla failed in this regard because the 2019 Model 3 Autopilot system lulled Mr. Monet into a false sense of security while driving.

Tesla has also allegedly known for years that "stopping for stationary objects [has] been a particularly difficult problem for Autopilot and other vision-based systems like Mobileye in the real world, and numerous drivers have rear-ended stopped vehicles such as highway patrol cars or fire trucks."

In October 2021, federal safety regulators questioned why Tesla didn't issue a formal recall when the automaker sent over-the-air software updates to help the Autopilot systems detect flashing emergency lights on parked vehicles.

The Tesla Model 3 Autopilot lawsuit was filed in the U.S. District Court for the Northern District of California (San Jose): Derrick Monet v. Tesla, Inc., et al.

The plaintiff is represented by Arias Sanguinetti Wang & Torrijos, LLP, and the Slavik Law Firm, LLC.

