
Watch The Heart-Stopping Moment A Tesla Owner Nearly Plowed Into A Moving Train In ‘Self-Drive’ Mode

In a recent incident, an Ohio Tesla owner, Craig Doty II, had a dangerously close call while using his vehicle’s Full Self-Driving (FSD) feature. Driving at night, Doty’s Tesla approached a passing train without decelerating, forcing him to intervene just in time to avoid a collision.

The incident is not an isolated one: Tesla has faced numerous lawsuits alleging that its FSD or Autopilot features caused accidents, some of them fatal. As of April 2024, Tesla models equipped with Autopilot had been implicated in 17 fatalities and 736 crashes since 2019, according to the National Highway Traffic Safety Administration (NHTSA).

Doty documented the incident online, saying his Tesla had tried to drive into a passing train twice within six months while in FSD mode. He struggled to find legal representation, however, because his injuries from the incident were minimal.

Tesla advises against using the FSD system in darkness or adverse weather, since these conditions can impair its effectiveness. The system relies primarily on cameras (supplemented on older vehicles by radar and ultrasonic sensors) to identify obstacles and navigate. When visibility is poor, those inputs can degrade, significantly raising the risk of an accident.
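To make that failure mode concrete, here is a minimal, hypothetical sketch (not Tesla’s actual software) of how a perception stack might gate automatic braking on detection confidence. Every name and threshold below is invented for illustration; the point is only that a detection degraded by darkness can fall below the confidence needed to trigger braking.

```python
from dataclasses import dataclass

# Hypothetical illustration only: real driver-assistance stacks are far more
# complex. All names and thresholds here are invented for the example.

BRAKE_CONFIDENCE_THRESHOLD = 0.7  # assumed: act only on confident detections

@dataclass
class Detection:
    label: str         # e.g. "train", "car"
    distance_m: float  # estimated distance to the obstacle
    confidence: float  # 0.0-1.0, degraded by darkness, fog, or rain

def should_brake(detections: list[Detection], stopping_distance_m: float) -> bool:
    """Brake if any sufficiently confident detection is within stopping distance."""
    return any(
        d.confidence >= BRAKE_CONFIDENCE_THRESHOLD
        and d.distance_m <= stopping_distance_m
        for d in detections
    )

# At night, the same train may be detected only with low confidence...
night = [Detection("train", distance_m=60.0, confidence=0.4)]
print(should_brake(night, stopping_distance_m=80.0))  # False: no braking

# ...while in daylight the detection clears the threshold.
day = [Detection("train", distance_m=60.0, confidence=0.9)]
print(should_brake(day, stopping_distance_m=80.0))  # True: brakes
```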

Despite such warnings and incidents like Doty’s, some drivers continue to trust the FSD system to perform as expected. That complacency can lead to dangerous situations, as Doty’s experience shows.

‘After using the FSD system for a while, you tend to trust it to perform correctly, much like you would with adaptive cruise control,’ he said. ‘You assume the vehicle will slow down when approaching a slower car in front until it doesn’t, and you’re suddenly forced to take control.

‘This complacency can build up over time due to the system usually performing as expected, making incidents like this particularly concerning.’

Although Tesla’s manual stresses that drivers must stay alert and be ready to take control of the vehicle rather than rely solely on the FSD feature, accidents linked to Tesla’s Autopilot persist. The NHTSA has attributed some of those crashes to a “weak driver engagement system”, one that does too little to verify that the driver is actually paying attention; a minimal sketch of what such a system does follows.
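To illustrate, here is a minimal, hypothetical driver-engagement escalation loop in Python. The timeouts, action names, and input signal are all invented for the example and are not Tesla’s actual parameters; a “weak” system would be one whose timeouts are long or whose input checks are easy to fool.

```python
# Hypothetical escalation policy for a driver engagement system.
# All names and thresholds below are invented for illustration.

WARN_AFTER_S = 10       # visual warning after 10 s without driver input
ALERT_AFTER_S = 20      # audible alert after 20 s
DISENGAGE_AFTER_S = 30  # slow the car and disengage after 30 s

def engagement_action(seconds_since_input: float) -> str:
    """Map time since the last detected driver input (e.g. steering-wheel
    torque) to an escalating response."""
    if seconds_since_input >= DISENGAGE_AFTER_S:
        return "slow_and_disengage"
    if seconds_since_input >= ALERT_AFTER_S:
        return "audible_alert"
    if seconds_since_input >= WARN_AFTER_S:
        return "visual_warning"
    return "no_action"

print(engagement_action(5))   # no_action
print(engagement_action(25))  # audible_alert
print(engagement_action(40))  # slow_and_disengage
```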

To address these concerns, Tesla has released software updates intended to improve both Autopilot and FSD, but it remains unclear how effective they are. Some experts argue for additional safeguards, such as using map data to restrict Autopilot to roads where it is known to operate reliably, as illustrated below.
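As a hedged illustration of that map-based proposal, the sketch below checks whether the vehicle’s position falls inside a pre-approved operating zone before enabling the feature. The coordinates, zone list, and function names are invented for the example.

```python
import math

# Hypothetical sketch of map-based geofencing for a driver-assistance feature.
# The "approved zones" (center latitude/longitude plus radius) are invented.
APPROVED_ZONES = [
    # (latitude, longitude, radius in kilometers)
    (40.0, -83.0, 5.0),  # example: a mapped, divided highway segment
]

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def assist_allowed(lat, lon):
    """Enable the feature only inside a pre-approved, mapped zone."""
    return any(
        haversine_km(lat, lon, zlat, zlon) <= radius
        for zlat, zlon, radius in APPROVED_ZONES
    )

print(assist_allowed(40.01, -83.01))  # True: inside an approved zone
print(assist_allowed(41.50, -81.70))  # False: feature stays off
```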

While Tesla and CEO Elon Musk maintain that their driver-assistance systems are safe, citing lower accident rates for Autopilot users than for conventionally driven vehicles, critics caution against over-reliance on these systems. They stress the need for continued improvement and regulatory oversight as autonomous driving technology develops and is deployed.
