Why the brand is recalling two million cars in the United States – L’Express

This will not reassure the detractors of automated driving. The American electric car manufacturer Tesla has initiated a recall of some two million vehicles in the United States over a risk linked to their driver-assistance system.

In a letter addressed to the Californian group owned by Elon Musk this Tuesday, December 12, the American highway safety agency (NHTSA) indicates that in certain circumstances, the driver-assistance function of Tesla vehicles may be misused, incurring an increased risk of collision.


Specifically, the NHTSA investigation, which examined 956 crashes in which Autopilot was reportedly used, finds that the system’s design is likely to result in “inadequate driver engagement and usage controls,” “which can lead to misuse of the system,” a spokesperson for the American agency said this Wednesday in an email to AFP.

If a driver uses the assistance incorrectly, uses it in poor conditions, or fails to recognize whether the function is activated, the risk of an accident could be higher, explains the NHTSA. For its part, Tesla acknowledged in its information report that the controls put in place on its Autopilot system “may not be sufficient to prevent misuse by the driver,” according to the authority’s email.

A remote update

This is not the first time that “Autopilot”, Tesla’s driver-assistance system, has been implicated in accidents. The system does not quite live up to its name: it is designed to help drivers manage acceleration, braking and steering, but the car cannot drive itself without a driver’s intervention.

The NHTSA thus began an evaluation process in 2021 to investigate 11 incidents involving stationary emergency vehicles and Tesla vehicles with the driver-assistance system engaged. Consequently, and “without agreeing with the analysis” of the NHTSA, Tesla decided on December 5 to initiate “a recall for a software update,” explains the highway authority.


The vehicles affected are certain Model S cars produced between 2012 and 2023 and equipped with the system; all Model X cars produced between 2016 and 2023; all Model 3 cars produced between 2017 and 2023; and all Model Y cars produced since 2020. The vehicles are to receive a remote update, which was to be deployed from December 12, 2023. It should add, in particular, additional alerts to encourage drivers to maintain control of their vehicle, “which involves keeping your hands on the steering wheel,” notes the authority.

The automobile brand led by Elon Musk had already carried out several recalls in the United States last year to remotely modify potentially problematic software.

Eight fatal or serious accidents since 2016


Tesla’s “Autopilot” thus remains mired in controversy. Last week, a former Tesla employee explained in an interview with the BBC that he did not believe the driver-assistance technology was safe enough to be used on public roads, and that his attempts to raise his concerns had been ignored internally.

These statements are supported by figures: according to an investigation by the American daily The Washington Post, at least eight fatal or serious Tesla accidents have occurred since 2016 on roads where “Autopilot” should not have been activated, backed by video evidence. For while the user manual for Tesla cars clearly specifies that “Autopilot is intended for use only on highways and private roads, with a fully attentive driver,” the company continues to promote its driver-assistance system as very safe on all roads.
