Lawsuit blaming Tesla’s Autopilot for driver’s death may go to trial, judge rules

FORT LAUDERDALE, Fla. (AP) — A jury should decide whether Tesla and Elon Musk exaggerated the capabilities of the electric car maker’s Autopilot system, leading to the death of a software engineer who turned it on, took his hands off the steering wheel and later crashed into a truck, a judge in Florida has ruled.

Circuit Judge Reid Scott denied Tesla’s motion for summary judgment in Kim Banner’s lawsuit, which accuses the company of causing the death of her husband, Jeremy Banner, in 2019. In a 23-page ruling, Scott found that Kim Banner’s lawyers had presented sufficient evidence to take the case to trial sometime next year. Scott also said Banner can seek punitive damages from the company, which could reach millions of dollars if awarded.

Citing other fatal accidents involving Autopilot, Scott wrote last week that there is a “real dispute” over whether Tesla “created a foreseeable danger zone that posed a general risk of harm to others.” Autopilot is supposed to automatically steer and brake the car when activated.

The judge had ordered his ruling sealed, but it was mistakenly made available on the Palm Beach County Clerk of Courts website Wednesday. It was removed shortly after The Associated Press obtained a copy.

Tesla attorney Whitney Cruz declined to comment Wednesday and the company did not respond to an email. Musk eliminated Tesla’s media and PR department four years ago.

Banner attorney Trey Lytal said in a statement Wednesday that Scott’s ruling “shows that Tesla’s conduct was not only negligent, but involved intentional and reckless decisions that resulted in the deaths of customers, including Jeremy Banner.” He believes Scott will release his decision in full soon.

“The public has a right to know these results and we firmly believe that will happen in the next few weeks,” Lytal said.

In denying Tesla’s request, Scott focused on the company’s marketing and Musk’s comments about Autopilot, pointing to other deaths that have occurred while using Autopilot. The company says in court documents that it warns drivers that its cars are not fully autonomous, that they still need to pay attention to the road and that they are ultimately responsible for steering and braking.


But Scott agreed that Banner’s lawyers had presented enough evidence to move forward with the case. Banner’s lawyers have argued that by naming the system “Autopilot,” Musk and Tesla were implying that the cars were self-driving and did not require the driver’s full attention. They also cite numerous comments Musk made years before 50-year-old Jeremy Banner’s accident in which he said Autopilot was already better than human drivers and would soon be autonomous.

The lawyers also point to a 2016 marketing video for Autopilot that is still on the company’s website. It begins with a statement: “The person in the driver’s seat is only there for legal reasons. He does nothing. The car drives itself.”

The Tesla then maneuvers through a city on winding roads and in traffic. It stops at traffic lights and stop signs, avoids other cars, pedestrians and cyclists, and speeds up and slows down accordingly. It then parallel parks itself. The camera is positioned to show that the man in the driver’s seat never touches the steering wheel or pedals.

Under questioning by Banner’s lawyers, Tesla employees revealed that the car in the ad was programmed with mapping software not available to the public and “still performed poorly and even ran into a fence during filming.” The video required multiple takes and was heavily edited, the lawyers say.

Scott wrote that after reviewing the evidence, he could not see “why some ordinary consumers would not believe that Tesla vehicles were capable of hands-free driving.”

And that’s exactly what Jeremy Banner did.

Just before dawn on March 1, 2019, he headed to work on a semi-rural Florida highway in his 2018 Tesla Model 3, which he had purchased months earlier.

While traveling at nearly 70 mph, Banner activated Autopilot and took his hands off the steering wheel. Ahead of him, a tractor-trailer pulled out of a farm driveway and across the highway. Neither the Tesla nor Banner recognized the danger, and neither braked or swerved. Ten seconds after Autopilot was activated, the car drove under the trailer, shearing off the roof and killing Banner instantly.

The National Transportation Safety Board, which investigated the accident, said the truck driver was primarily responsible for pulling into traffic, but also said Banner and Tesla shared the blame.

“An alert driver would have seen the truck in time to take evasive action,” the NTSB said of Banner. The agency said Tesla’s Autopilot should have safeguards that do not allow the system to be used on highways with cross traffic. The cars should also ensure that drivers using Autopilot keep their hands on the steering wheel.

“The NTSB and researchers found that drivers are poor at monitoring automation and do not perform well on tasks that require passive vigilance,” the 2020 report said.

The trucking company has already reached a confidential settlement with Kim Banner and is no longer part of the lawsuit.