(New York) New blow for Tesla: the American electric car manufacturer has initiated a recall in the United States and Canada of some two million vehicles over an increased risk of collision linked to “Autopilot”, its controversial driver-assistance system.

After a two-year investigation, the US National Highway Traffic Safety Administration (NHTSA) indicated that, in certain circumstances, the assisted-driving function of Tesla vehicles lends itself to misuse, increasing the risk of collision.

The investigation found that the design of the system is likely to result in “inadequate driver engagement and usage controls”, “which can lead to misuse of the system”, an NHTSA spokesperson told AFP on Wednesday.

If a driver uses the driver assistance incorrectly or in poor conditions, or fails to recognize whether the function is activated, the risk of an accident could be higher, explains the NHTSA, whose conclusions were sent to the manufacturer by email on Tuesday.

For its part, Tesla acknowledged in its information report that the controls put in place on its system “may not be sufficient to prevent misuse by the driver,” again according to the authority’s email.

Affected vehicles include certain Model S cars produced between 2012 and 2023 that are equipped with the system, along with other Tesla models.

They will receive an over-the-air software update, which was expected to begin rolling out on December 12, 2023.

This is not the first time that “Autopilot”, Tesla’s assisted driving system, has been questioned.

Tesla has been offering assisted driving on all its new cars for several years.

At its core, the system can adapt the car’s speed to traffic and keep it in its lane. In all cases, the driver must remain vigilant, with their hands on the steering wheel, Tesla specifies on its website.

The manufacturer offers and tests more advanced options, such as lane changes, parking assistance, and taking traffic lights into account, bundled depending on the country into the “Enhanced Autopilot” or “Full Self-Driving Capability” option packages.

But the software has been accused by many industry players and experts of giving drivers the false impression that the car is driving itself, with the risk of causing potentially serious accidents.

At the beginning of November, Tesla won a first round over the role of its “Autopilot” in a fatal accident near Los Angeles in 2019. In that case, the jury found that the driver-assistance system did not present any manufacturing defect. Another case concerning the role of this system in a separate fatal accident is expected to go to trial next year.

NHTSA opened its assessment process in 2021 to investigate 11 incidents in which Tesla vehicles with the driving-assistance system activated struck stopped first-responder vehicles.

Consequently, and “without agreeing with the analysis” of the NHTSA, Tesla decided on December 5 to initiate “a recall for a software update,” explains the highway authority.

The update will notably add alerts encouraging drivers to maintain control of their vehicle, “which involves keeping their hands on the wheel,” the authority notes.

However, since “the problems can be resolved via a software update, this is not a financial disaster for Tesla and the problems should be quickly resolved,” it adds.

The group already carried out several recalls in the United States last year to remotely fix potentially problematic software. At the start of 2022, Tesla had to disable an option that, under certain conditions, allowed cars to roll through stop signs without coming to a complete stop.

The automaker, which raked in $81.5 billion in sales last year, confirmed in October that it plans to produce 1.8 million vehicles in 2023.