DETROIT — Tesla is recalling nearly all vehicles it sold in the U.S., more than 2 million, to update software and fix a defective system that is supposed to make sure drivers are paying attention when using Autopilot.
However, at this time, the Tesla Semi, a Class 8 all-electric tractor, is not included in this recall.
The update will increase warnings and alerts to drivers and limit the areas in which the basic version of Autopilot can operate, according to documents posted Dec. 13 by U.S. safety regulators.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration into a series of crashes, some of them fatal, that happened while Autopilot, a partially automated driving system, was in use.
The agency said its investigation found that Autopilot’s method of making sure drivers are paying attention can be inadequate and can lead to “foreseeable abuse of the system.”
The added controls and alerts will “further encourage drivers to comply with ongoing driving responsibilities,” the documents state.
But safety experts said that while the recall is a good step, it still leaves responsibility with the driver and does not fix the underlying problem that Tesla’s automated system has trouble detecting obstacles in its path and stopping the vehicle.
The recall covers models Y, S, 3 and X produced between Oct. 5, 2012, and Dec. 7 of this year. The update was sent to certain affected vehicles on Dec. 12, with the rest scheduled to receive it later.
Tesla’s stock fell more than 3% in trading on Dec. 13 but recovered amid a broader stock market rally and ended the day up 1%.
The attempt to address Autopilot’s flaws seemed like a case of too little, too late to Dillon Angulo, who was seriously injured in a 2019 crash involving a Tesla using the technology on a rural stretch of Florida highway where the software was not intended to be deployed.
“This technology is unsafe and must be removed from the roads,” said Angulo, who is suing Tesla as he recovers from injuries that included brain trauma and broken bones. “The government has to do something about it. We can’t do experiments like this.”
Autopilot includes features called Autosteer and Traffic-Aware Cruise Control, with Autosteer intended for use on limited-access highways when the vehicle is not operating a more advanced feature called Autosteer on City Streets.
The software update limits where Autosteer can be used. “If the driver attempts to activate Autosteer when the activation conditions are not met, the feature will alert the driver through visual and audible warnings that it is not available, and Autosteer will not activate,” the recall documents say.
Depending on a Tesla’s hardware, the added controls may include making visual alerts more prominent, simplifying how Autosteer is turned on and off, and additional checks on whether Autosteer is being used off controlled-access roads and when approaching traffic control devices. Drivers who repeatedly fail to “demonstrate continued and sustained driving responsibility” may be suspended from using Autosteer, the documents state.
According to the recall documents, agency investigators met with Tesla beginning in October to explain their “preliminary conclusions” about fixing the monitoring system. Tesla disagreed with NHTSA’s analysis but agreed to the recall on Dec. 5 to resolve the investigation.
Auto safety advocates have long called for stronger regulation of driver monitoring systems, which mostly detect whether a driver’s hands are on the steering wheel. They have called for cameras that check whether drivers are paying attention, which other automakers with similar systems already use.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies automated vehicle safety, said the software update is a compromise that does not address the lack of night-vision cameras to monitor drivers’ eyes, or Teslas’ failure to detect obstacles and stop.
“This compromise is unfortunate because it doesn’t solve the problem that older cars don’t have the proper hardware to monitor drivers,” Koopman said.
Koopman and Michael Brooks, executive director of the nonprofit Center for Auto Safety, argue that crashes with emergency vehicles are a safety deficiency that is not being addressed. “It doesn’t really dig into the root of what the investigation is looking at,” Brooks said. “It doesn’t answer the question of why Tesla on Autopilot doesn’t detect and respond to emergency activity.”
Koopman said NHTSA appears to have decided that a software change is the best it can get from the company, and that “the benefits of doing this outweigh the cost of spending another year in a dispute with Tesla.”
NHTSA said in a Dec. 13 statement that the investigation remains open “as we continue to monitor the effectiveness of Tesla’s relief efforts and work with the automaker to ensure the highest level of safety.”
Autopilot can steer, accelerate and brake automatically within its lane, but it is a driver-assistance system and, despite its name, cannot drive itself. Independent tests have found that the monitoring system is easy to fool; drivers have been caught driving drunk or even sitting in the back seat.
Tesla said in a defect report filed with safety authorities that Autopilot’s controls “may not be sufficient to prevent driver misuse.”
A message seeking further comment was left with the Austin, Texas, company in the early morning hours of Dec. 13.
Tesla says on its website that Autopilot and the more sophisticated Full Self-Driving system are intended to assist drivers, who must be ready to intervene at all times. Full Self-Driving is being tested on public roads by Tesla owners.
In a statement posted Dec. 11 on X, formerly Twitter, Tesla said safety is stronger when Autopilot is engaged.
Since 2016, NHTSA has sent investigators to 35 Tesla crashes in which the vehicles were suspected of operating on an automated system. At least 17 people died.
The probe is part of a larger NHTSA investigation into multiple incidents in which Teslas using Autopilot crashed into emergency vehicles. NHTSA has become more aggressive in pursuing safety problems with Teslas, including a recall of the Full Self-Driving software.
Transportation Secretary Pete Buttigieg, whose agency includes NHTSA, said in May that Tesla should not call a system that cannot drive itself Autopilot.
The Associated Press is an independent, global news organization dedicated to fact-based reporting. Founded in 1846, AP remains the most trusted source of fast, accurate and unbiased news in all its formats and a key provider of the technologies and services essential to the news business. The Trucker Media Group is a subscriber to The Associated Press and is licensed to use this content on TheTrucker.com and The Trucker newspaper pursuant to a content license agreement with The Associated Press.