Tesla’s “Full Self-Driving” (FSD) driver-assistance technology has been accused of causing an eight-car crash on Thanksgiving Day (Nov. 24). The incident occurred on I-80 in California’s Bay Area, leaving one person hospitalized and eight others with minor injuries.
The California Highway Patrol report on the incident says a Tesla Model S traveling on I-80 at 55 mph crossed several lanes of traffic and then slowed abruptly to just 20 mph, triggering the crash as other cars still traveling at highway speed had no chance to avoid the slow-moving electric vehicle. The driver blamed the crash on the controversial “Full Self-Driving” system, which he claimed had malfunctioned, but police were unable to determine whether the software was in operation or whether his statement was accurate.
It appears that the police will be unable to resolve this question. According to a CHP representative, the agency cannot determine whether “Full Self-Driving” was activated; Tesla would have that information. As Tesla expanded its beta program for FSD, the system was implicated in a growing number of crashes, and the automaker was forced to issue a recall in late 2021 for cars running firmware linked to so-called “phantom braking” events, in which faulty software improperly triggered the cars’ automatic emergency braking systems.
The National Highway Traffic Safety Administration (NHTSA), which is investigating Tesla after reports of braking “without warning, at random, and often repeated in a single drive,” did not immediately comment on the San Francisco crash. The agency has since upgraded that investigation to what it calls an engineering analysis. Jennifer Homendy, chair of the National Transportation Safety Board, has questioned whether “full self-driving” is an accurate description of the technology and has said Tesla must do more to prevent misuse.