Tesla Autopilot Crash Caught On Dashcam Shows How Not To Use The System


Tesla’s Autopilot is once again in the eye of the storm after a Tesla Model S hit a barrier on a highway near Dallas, Texas. The driver, who walked away with only minor bruises, blamed Tesla’s Autopilot for the crash.

Just a few months ago we saw a Tesla help a driver avoid a crash, and statistics from the National Highway Traffic Safety Administration indicated that Autopilot has in fact helped reduce road accidents. But the account given by the driver involved in this latest crash painted a different picture.

The driver described the accident in a Reddit post on Monday:

“I was driving in the left lane of a two lane highway. The car is AP1 (first generation Autopilot) and I’ve never had any problems until today. Autopilot was on and didn’t give me a warning. It misread the road and hit the barrier. After the airbags deployed there was a bunch of smoke and my car rolled to a grinding stop. Thankfully no one was hurt and I walked away with only bruises.”

He also attached pictures of the aftermath:

Pic Credits: Electrek

While it is easy to put the blame on Tesla, the company has repeatedly reiterated that Autopilot is only a “driver assist” system and thus requires drivers to remain vigilant and keep their hands on the steering wheel at all times.

Tesla explains how its Automatic Emergency Braking (AEB) feature works:

“AEB does not engage when an alternative collision avoidance strategy (e.g., driver steering) remains viable. Instead, when a collision threat is detected, forward collision warning alerts the driver to encourage them to take appropriate evasive action.”
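In plainer terms, AEB is a last resort, not a first response. The sketch below is a purely illustrative, simplified rendering of the decision logic Tesla describes in that quote; the function name, inputs, and return values are assumptions made for illustration, not Tesla’s actual implementation.

```python
# Hypothetical, simplified sketch of the behavior Tesla describes above.
# Names and return strings are illustrative assumptions, not Tesla's code.

def respond_to_collision_threat(threat_detected: bool,
                                steering_escape_viable: bool) -> str:
    """Return the system's response to a detected collision threat."""
    if not threat_detected:
        return "no action"
    if steering_escape_viable:
        # An alternative avoidance strategy (driver steering) still exists,
        # so emergency braking stays out of the loop and the driver
        # only receives a forward collision warning.
        return "forward collision warning"
    # No viable alternative remains: automatic emergency braking engages.
    return "automatic emergency braking"


# Example: a barrier ahead, but the driver could still steer around it.
print(respond_to_collision_threat(True, True))   # forward collision warning
print(respond_to_collision_threat(True, False))  # automatic emergency braking
```

Under this reading, the system in the Dallas crash behaved as documented: steering around the barrier remained possible, so braking was left to the driver.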

The only case in which Tesla could be blamed is if the system had forcibly taken over the controls and steered the car into the side of the road against the driver’s input, which, of course, did not happen.

Three days later, another Redditor on the Tesla Motors subreddit posted dashcam footage capturing the same crash. The footage clearly shows that the car needed to change lanes to avoid the barrier, a maneuver Tesla’s Autopilot is not designed to make on its own and one that requires intervention by the driver.
https://www.youtube.com/watch?v=kvIB0a7vXdQ

Part of the blame can also be placed on the road itself, whose markings give too little warning of the lane shift; even the driver of the vehicle with the dashcam can be seen nearly hitting the barrier.

As far as Tesla’s Autopilot is concerned, its Autosteer feature did its job of keeping the vehicle within the marked lane, which unfortunately led it straight into the barrier. The AEB feature can’t be blamed either, since it is not designed to engage while an alternative avoidance maneuver remains available, and slamming on the brakes at highway speed could be even more dangerous than sideswiping a barrier.

The bottom line is that the driver should have been more alert and should have manually steered the car to the right in anticipation of the ill-placed barrier. Such complex decision-making would require Level 4 or 5 autonomy, which Tesla’s Autopilot does not yet possess.

What are your views on the crash? Whose fault was it? Comment below!
