DETROIT (AP) — Tesla is recalling nearly all vehicles sold in the U.S., more than 2 million, to update software and repair a broken system designed to ensure drivers pay attention when using Autopilot.
Documents released Wednesday by U.S. safety regulators say the update will add warnings and alerts for drivers and even limit the areas where basic versions of Autopilot can be used.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of accidents that occurred while the Autopilot semi-automated driving system was in use. Some were fatal.
The agency says its investigation found that Autopilot’s method of ensuring drivers are paying attention may be inadequate and could lead to “foreseeable abuse of the system.”
The additional controls and warnings will “further encourage the driver to meet their ongoing driving responsibilities,” the documents say.
However, safety experts said that while the recall was a good step, it still held drivers responsible and did not address the underlying problem that Tesla's automated systems have trouble spotting and stopping for obstacles in their path.
The recall affects models Y, S, 3 and X manufactured between October 5, 2012 and December 7 of this year. The update should be sent to certain affected vehicles on Tuesday, with the rest receiving it later.
Tesla shares fell more than 3% in early trading on Wednesday, but recovered amid a broad stock market rally to end the day up 1%.
Trying to fix Autopilot's flaws seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla that was using the technology along a rural stretch of Florida highway where the software isn't supposed to be deployed.
“This technology is not safe, we need to take it off the road,” said Angulo, who is suing Tesla as he recovers from injuries including brain trauma and broken bones. “The government has to do something about it. We can’t experiment like this.”
Autopilot includes features called Autosteer and Traffic Aware Cruise Control. Autosteer is intended for use on limited-access highways when the vehicle is not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will restrict the use of Autosteer. "If the driver attempts to engage Autosteer when the conditions for engagement are not met, the feature will alert the driver through visual and audible warnings that Autosteer is not available, and Autosteer will not engage," the recall documents say.
Depending on a Tesla’s hardware, the additional controls include “increasing prominence” of visual warnings, simplifying how Autosteer is turned on and off, and additional checks on whether Autosteer is being used outside controlled-access roads and when approaching traffic control devices. A driver could be suspended from using Autosteer if they repeatedly fail to “demonstrate consistent and sustained driving responsibility,” the documents say.
According to recall documents, agency investigators met with Tesla starting in October to explain “preliminary conclusions” about repairing the monitoring system. Tesla disagreed with NHTSA’s analysis, but agreed to the recall on December 5 to settle the investigation.
Car safety advocates have for years called for greater regulation of the driver monitoring system, which primarily detects whether the driver’s hands are on the steering wheel. They have called for cameras to ensure the driver is paying attention, like those used by other automakers with similar systems.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that doesn’t address the lack of night-vision cameras to monitor drivers’ eyes, nor Teslas’ failure to spot and stop for obstacles.
“The compromise is disappointing because it does not address the problem that the older cars do not have sufficient driver monitoring hardware,” Koopman said.
Koopman and Michael Brooks, executive director of the nonprofit Center for Auto Safety, contend that crashes into emergency vehicles are a safety defect that is not being addressed. “It’s not getting at the root of what the investigation was opened for,” Brooks said. “It doesn’t answer the question of why Teslas on Autopilot aren’t detecting and responding to emergency activity.”
Koopman said NHTSA apparently decided the software change was the best it could get from the company, “and the benefits of doing so now outweigh the costs of fighting with Tesla for another year.”
In its statement Wednesday, NHTSA said the investigation remains open “as we monitor the effectiveness of Tesla’s remedial actions and continue to work with the automaker to ensure the highest level of safety.”
Autopilot can steer, accelerate and brake automatically in its lane, but it is a driver-assistance system and, despite its name, cannot drive itself. Independent tests have found that the monitoring system is so easy to fool that drivers have been caught driving drunk or even sitting in the back seat.
In its deficiency report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver abuse.”
A message was left early Wednesday seeking further comment from the Austin, Texas-based company.
Tesla says on its website that Autopilot and a more sophisticated Full Self-Driving system are intended to help drivers, who must be ready to intervene at all times. “Full Self-Driving” is being tested by Tesla owners on public roads.
In a statement posted on X, formerly Twitter, on Monday, Tesla said safety is higher when Autopilot is engaged.
NHTSA has sent investigators to 35 Tesla accidents since 2016 where the agency suspects the vehicles were operating on an automated system. At least 17 people were killed.
The investigations are part of a larger investigation by NHTSA into several cases in which Teslas with Autopilot crashed into emergency vehicles. NHTSA has become more aggressive in pursuing safety issues with Teslas, including a recall of its Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla shouldn’t call the system Autopilot because it can’t drive itself.
___
AP Technology writer Michael Liedtke contributed to this story.