Concerns have been raised about the safety of Tesla’s Autopilot feature following a significant rise in fatalities and serious injuries. Reports released by authorities in June 2022 had indicated only three confirmed deaths connected to the technology.
However, the latest data shows a minimum of 17 fatal incidents, with 11 occurring since last May.
Elon Musk, Tesla’s CEO, supports his claim that Autopilot is safer than cars driven solely by humans by comparing crash rates between the two driving modes.
Musk envisions a safer future without accidents by incorporating advanced features into Tesla vehicles, enabling them to navigate through traffic and respond to various obstacles like school buses, fire engines, stop signs, and pedestrians.
However, real-world testing of Autopilot on American roads has revealed notable flaws, and determining how many crashes were preventable is challenging. Despite the associated risks, Musk continually promotes the benefits of installing driver-assistance technologies in Tesla vehicles.
The recent rise in fatal accidents may be attributed to specific decisions made by Elon Musk. Experts suggest that the wider availability of driver-assistance features and the removal of radar sensors may have contributed to the increase. The National Highway Traffic Safety Administration (NHTSA) is currently investigating Tesla’s Autopilot and Full Self-Driving features.
Notably, the NHTSA’s data includes incidents in which it is unclear whether Autopilot or Full Self-Driving was in use. All advanced driver-assistance systems, including Tesla’s Autopilot, require the driver to remain in control and fully engaged at all times.
According to NHTSA spokesperson Veronica Morales, the responsibility for operating the vehicle lies with the human driver.
Former NHTSA senior safety adviser Missy Cummings agrees that Tesla experiences more severe and fatal crashes than the dataset average. She believes the expanded rollout of Full Self-Driving technology over the past year and a half likely plays a role.
These findings highlight the potential dangers and risks associated with excessive reliance on autonomous driving technology. Both manufacturers and drivers must prioritize safety and understand that these systems are designed to assist, not replace, human attention and responsibility on the road.
“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know — or their state does.”