Self-driving cars have yet to mature into the dependable personal taxis once envisioned. Indeed, autonomous driving technology, such as Tesla's Full Self-Driving, has repeatedly been shown to be plagued with flaws.
Recently, the Dawn Project’s safety test, led by Dan O’Dowd, revealed that Tesla Full Self-Driving will collide with a child mannequin in a stroller.
The Dawn Project conducted the safety tests in a public parking lot in Santa Barbara, California, in October 2022, using the most recent version of Tesla Full Self-Driving Beta available at the time, version 10.69.2.2.
After several tests, the Dawn Project established that the car would drive over a stroller in a parking lot. Furthermore, even when a baby, simulated with a mannequin, was visible in the stroller, the self-driving AI would still hit it.
The tests were also moved from grocery-store parking lots to public roadways. Even there, the Tesla self-driving AI would hit a stroller placed in the center of the road.
On each run, the internal camera shows the viewer that Full Self-Driving is engaged. A flashlight was also used to demonstrate that the accelerator pedal was not pressed during any of the safety tests.
The only error message shown on the Tesla’s display during testing was “Supercharging Unavailable: Add a payment method to your Tesla account.”
“Tesla Full Self-Driving represents a potentially lethal threat to child pedestrians,” the Dawn Project wrote.
“Our tests were conducted in real-world driving scenarios on public roads, highlighting further the immediate and real danger posed to child pedestrians by Elon Musk’s dangerous and defective Full Self-Driving software.”