Tesla CEO Elon Musk has long emphasized that the company’s future lies in fully autonomous driving rather than competing solely on electric vehicle production. Unlike many automakers that incorporate radar and lidar for advanced driver assistance, Tesla relies exclusively on cameras and artificial intelligence—a decision that remains both unique and controversial.
However, a recent experiment by YouTuber and engineer Mark Rober highlights the potential flaws of Tesla’s vision-only approach. In a video on his channel, Rober, a former NASA engineer, conducts a test reminiscent of a classic Looney Tunes gag: can a painted wall trick a Tesla on Autopilot into crashing? Unfortunately for Tesla, its engineers haven’t yet invented Road Runner’s ability to zip through fake tunnels—so, just like Wile E. Coyote, the car slams straight into the painted wall.
For comparison, Rober also tests a Lexus SUV equipped with lidar from Luminar. Both vehicles initially perform well, stopping when a simulated child appears in front of them. But when faced with the painted wall designed to mimic the road and sky, the differences become stark: the Lexus detects the obstacle and stops safely, while the Tesla, relying solely on its cameras, fails to recognize the deception and drives right into the wall.
“This is the first time in history we can definitively say that Tesla’s camera system would absolutely smash through a fake wall without even a slight tap on the brakes,” Rober quipped.
Tesla’s Autopilot and Full Self-Driving systems have made significant advances, yet they remain under scrutiny after numerous crashes, including fatal ones. Critics argue that a system that, like a human driver, relies solely on vision may never be able to exceed human levels of safety.
While most drivers will never encounter a fake tunnel, the test raises a broader question: what else might Tesla’s AI overlook, especially as the company moves toward fully autonomous vehicles with no steering wheels or pedals?