Tesla is recalling over 2 million vehicles in the United States equipped with its Autopilot advanced driver-assistance system. The move comes after years of media attention and warnings from federal safety regulators that the system posed safety concerns. The automaker is reportedly issuing an over-the-air update that will add new safeguards against distracted driving.
The National Highway Traffic Safety Administration (NHTSA) has been formally investigating the EV manufacturer for the last two years and believes Autopilot poses enough of a safety risk in its current form to push for the recall. That makes this the largest recall in the company’s history, encompassing just about every Tesla model ever produced.
While the automaker said it did not agree with NHTSA’s assessment, it will comply by offering the aforementioned over-the-air update that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”
Other nations are reportedly considering the matter, but none have taken the same actions seen in the United States.
Though it may not make much of a difference, as the evolving safety nets embedded into Tesla’s Autopilot have a long history of being defeated by motorists. In the early days, many Model S owners attempting hands-free driving recommended wedging an orange or tennis ball into the spokes of the steering wheel to trick Autopilot into thinking there were still human hands on the controls.
While the company has continued upgrading the system to be harder to fool, even going so far as to install driver-monitoring cameras, the NHTSA still believes not enough is being done.
“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature,” state the relevant recall documents.
Fair enough. But studies have shown that advanced driving aids dull the senses and slow reaction times, and Tesla is hardly the only company that sells them. Until the industry finally delivers something that can be considered valid, error-free SAE Level 5 autonomous driving, there’s really not much point in having the systems installed in the first place.
Your author would argue that federal regulators are obsessing about how easily defeated these systems are while totally ignoring the likelihood that faux self-driving features are inherently unsafe in themselves. I’ve made a similar argument about touch screens, which are proven to take more attention away from the road than older infotainment solutions featuring buttons and knobs. However, the NHTSA just doesn’t seem interested in addressing what some believe to be the root cause of increased accident rates.
Rather than fighting to remove, alter, or simply explore the modern hardware that’s arguably promoting safety gaps, the NHTSA appears more interested in finding ways to modify how drivers interact with their vehicles. It would be like uncovering a fatal airbag defect and recommending that manufacturers limit the vehicle’s speed and include a cumbersome bulletproof vest for the driver to wear, rather than replacing the hazard-prone units.
On Wednesday, acting NHTSA Administrator Ann Carlson praised Tesla during a congressional hearing for agreeing to the Autopilot recall. “One of the things we determined is that drivers are not always paying attention when that system is on,” she said.
“My immediate response was, ‘We have to do something about this,’” Carlson noted in reference to some of the high-profile media coverage of Tesla crashes she had witnessed.
The NHTSA opened its first formal safety investigation into Tesla’s Autopilot in 2017 and decided to take no action against the company. The second probe was launched in 2021, citing incidents dating back to 2016. Meanwhile, Carlson wasn’t made acting administrator until September of 2022.
Granted, the agency has launched dozens of special investigations into accidents where Autopilot was said to be a factor since the earliest accounts. But the NHTSA never seemed capable of connecting the dots and that may still be the case. Even after the recall, the agency has continued using a lot of non-committal language in regard to Autopilot and stated that it plans to keep its investigation open to see how well the over-the-air update addresses the problem.
The current NHTSA probe of Autopilot reportedly came after the agency identified more than a dozen crashes in which Tesla vehicles hit stationary emergency vehicles. It likewise examined over 950 crashes where Autopilot was alleged to have been a contributing factor in the initial reporting. Some of those incidents undoubtedly involved drivers using the feature as a scapegoat; others involved motorists misunderstanding how it actually functioned; the rest served as evidence that self-driving was more of a marketing gimmick than a real promise.
Regulators ultimately claimed the design of the system provided inadequate driver engagement and lackluster controls that could lead to “foreseeable misuse of the system.”
But we don’t see this kind of scrutiny being applied to other companies offering similar systems. This may be due to the irresponsible way in which Tesla was marketing the relevant features. Autopilot still requires full-time driver engagement, and its Full Self-Driving Capability is probably one of the more egregious misnomers in automotive history.
Those, and other, marketing assertions have gotten the brand into some separate legal issues. However, other companies are also selling similarly problematic designs framed as “safety suites” that effectively lead to identical forms of driver disengagement. It’s hard to say that Tesla doesn’t deserve some added attention. But the NHTSA seems to be harping on Tesla while the rest of the industry attempts to copy exactly what it’s doing. This is particularly frustrating since we know the agency has launched numerous studies about distracting tech in the past.
At any rate, the Tesla recall is supposed to be dealt with via the software update that started being pushed on December 7th. If you’re the owner of a Model S, X, 3, or Y made after 2012, you may have already downloaded it and noticed some changes.
The update is said to increase the frequency of visual alerts Autopilot presents to the driver while simplifying the engagement and disengagement of the Autosteer function. It’s the usual regulatory remedy of finding novel ways to annoy drivers in lieu of finding real solutions to the problem.