Wonderful Engineering

New Tests Conducted On Tesla’s Autonomous Driving System Reveal Its Weaknesses

Researchers from an Israeli institute have found a loophole in Tesla's autonomous mode

Tesla touts its autonomous mode as a flagship selling point; however, there have been numerous incidents after which calling it a self-driving car manufacturer would be a false statement.

It is high time that Tesla started classifying its vehicles as semi-autonomous, as its claims have been contradicted in several accidents and tests.

The technology in question has been fooled once again in the latest test, conducted by a team of researchers from Ben-Gurion University of the Negev, Israel. The test revolves around altering roadside billboards to display a stop sign or a speed limit, and checking whether Tesla's auto-driving feature detects those signs and reacts accordingly. It could be another key defect in the technology, making it unadvisable for use at large.

Images of a stop sign or a speed limit, even when shown for only a fraction of a second, proved capable of triggering the autonomous feature's detection systems.

Testing Through Phantom Attacks

According to the researchers, they conducted split-second "phantom attacks", in which Tesla's Model X treats a depthless image as a real object even when it is flashed on a billboard for only a few hundred milliseconds.

The damage could be done with ease by hacking into the internet-connected digital signboards along the road; an attacker with ill intent would not even need physical access, only a remote connection.

The research team embedded a stop sign into an advertisement playing on a billboard, and the Tesla detected it and stopped. This suggests that even if no one ever hacks the digital billboards, an ad showing a similar sign could cause trouble.

In another test, they showed a Tesla obeying a false speed limit flashed for a fraction of a second on a nearby signboard. These results are bad news for autonomous driving systems and should be rectified at the earliest.

In a statement to Wired, one of the researchers, Yisroel Mirsky, said, “This flaw in the system makes the car react accordingly, and the driver doesn’t even know what’s happening. It is a matter of 0.42 seconds for Tesla’s Mobileye device to detect these signs,” which could lead to deadly accidents.
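To see why such brief flashes matter, here is a minimal, purely illustrative sketch (this is not Tesla's or Mobileye's actual logic; all names, the frame rate, and the 0.5-second threshold are assumptions) of a temporal-persistence filter: a detected sign is only accepted once it has been visible continuously for a minimum duration, so a 0.42-second phantom would be rejected.

```python
# Illustrative sketch of a temporal-persistence defense against split-second
# "phantom" signs. NOT Tesla's or Mobileye's real pipeline; the threshold and
# frame rate are hypothetical values chosen to show the idea.

MIN_DURATION = 0.5  # hypothetical acceptance threshold, in seconds
FRAME_DT = 0.033    # assumed ~30 fps camera

def accepted_signs(frames, min_duration=MIN_DURATION, dt=FRAME_DT):
    """frames: list of per-frame detection sets, e.g. [{'stop'}, set(), ...].
    Returns the set of signs seen continuously for at least min_duration."""
    seen_since = {}   # sign -> frame index at which its current streak began
    accepted = set()
    for i, detections in enumerate(frames):
        for sign in detections:
            seen_since.setdefault(sign, i)
            if (i - seen_since[sign] + 1) * dt >= min_duration:
                accepted.add(sign)
        # a sign that disappears loses its streak
        for sign in list(seen_since):
            if sign not in detections:
                del seen_since[sign]
    return accepted

# A stop sign flashed for ~0.42 s (13 frames at ~30 fps) never passes...
flash = [{'stop'}] * 13 + [set()] * 10
assert accepted_signs(flash) == set()
# ...while a real sign visible for ~2 s (60 frames) is accepted.
real = [{'stop'}] * 60
assert accepted_signs(real) == {'stop'}
```

The trade-off, of course, is latency: any persistence threshold delays reaction to genuine signs, which is presumably one reason production systems accept detections so quickly.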

Possibility Of Creating Easy Havoc

Hacking the billboards is an easy task, and it doesn't even leave any traces behind for the attacker. This makes the situation worse.

The research was put forward to Tesla as a concern, and the giant automaker gave its usual reply: its autonomous mode is not designed to drive the car entirely on its own and requires human attention.

However, the research team wasn't convinced, replying that Tesla's users understand it the other way around because of misleading marketing. As a result, Tesla users don't pay enough attention to the road while in autonomous mode.

The mistake of treating the system as fully autonomous has caused numerous accidents. Well-known examples include a Tesla that crashed into a police car while its owner was watching a movie, and another recent incident in which a drunk driver sat in the passenger seat while his Tesla crashed.

Worse still, some Tesla users assume it is safe to sleep while on the road in autonomous mode, which has been behind a number of Tesla accidents.

The team's research will be presented at the ACM Conference on Computer and Communications Security next month. Watch the video below to learn more about the flaws of autonomous mode.
