Tesla Autopilot linked to hundreds of collisions, has ‘critical safety gap’: NHTSA


A Tesla Model X caught fire after crashing on U.S. Highway 101 in Mountain View, California, on March 23, 2018.

Federal authorities have found that a “critical safety gap” in Tesla’s Autopilot system contributed to at least 467 collisions, including 13 fatalities and many serious injuries. The National Highway Traffic Safety Administration analyzed 956 crashes in which Autopilot was thought to have been in use over a three-year period and published the results of its investigation on Friday.

The NHTSA report stated that Tesla’s Autopilot design led to predictable misuse and avoidable crashes because it did not adequately ensure driver attention and appropriate use. The agency is also investigating the effectiveness of a software update Tesla issued in December as part of a voluntary recall of about 2 million U.S. vehicles, which was meant to improve driver monitoring and fix the Autopilot defects identified in the investigation.

Despite the software update, Autopilot-related crashes continue to be reported, raising concerns about whether the fix goes far enough. In one recent incident, a Tesla driver in Washington struck and killed a motorcyclist while using Autopilot.

These findings add to a series of reports questioning the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator. Tesla has not responded to the NHTSA report and did not reply to a request for comment.

Earlier this month, Tesla settled a wrongful-death lawsuit brought by the family of Walter Huang, who died in a 2018 crash involving his Tesla Model X while using Autopilot. The terms of the settlement have been kept confidential.

Elon Musk recently emphasized Tesla’s commitment to autonomous driving, saying that those who doubt the company can solve autonomy should not invest in it. Musk has promised self-driving capability through software updates, but Tesla currently offers only driver-assistance systems.

Some experts, including automotive safety researcher Philip Koopman, have criticized Tesla’s marketing of Autopilot as “autonowashing” and hope the company takes NHTSA’s concerns seriously. Despite Musk’s claims about Autopilot’s safety benefits, critics argue that people are dying because of misplaced confidence in the system’s capabilities, and they call for relatively simple safeguards such as restricting Autopilot to the roads it was designed for and improving driver monitoring.

