On Thursday, Consumer Reports revealed that Tesla vehicles with Autopilot engaged could easily be tricked into driving with no one behind the wheel. The publication was prompted by a recent fatal crash in Texas involving a Tesla that reportedly had no one in the driver's seat.
The Consumer Reports researchers attached a weighted chain to the steering wheel of a Tesla Model Y to simulate a driver's hand, then drove the car around a test track for several miles.
During the test, one researcher sat in the passenger seat and another took the back seat to observe how the car behaved with the driver's seat left empty.
They found it easy to trick the system. The researchers never even opened the driver's side door, a detail that could have tipped off the car that no driver was present, and as expected, it missed that cue as well.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Jake Fisher, CR’s senior director of auto testing, said in a statement. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
The test was conducted by professionals on a closed track, and Consumer Reports stressed that it should stay that way: no one should attempt to replicate it, as doing so could cause a fatal accident, endangering not only those involved but also others on the road. The Consumer Reports test was run at speeds below 30 mph, with a team of professionals standing alongside the track for added safety.
“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver’s seat is putting themselves and others in imminent danger,” Fisher says.
Federal crash investigators are now examining the Tesla accident that took place in Texas. However, Tesla CEO Elon Musk said that data recovered from the crashed vehicle shows Autopilot was not even engaged at the time.
The research also found that Tesla's Autopilot can disengage unexpectedly without setting off a warning. “Autopilot makes mistakes, and when it encounters a situation that it cannot negotiate, it can immediately shut itself off,” Fisher said. “If the driver isn’t ready to react quickly, it can end in a crash.”