Late in November, an eight-car pileup occurred on the San Francisco Bay Bridge. At the time, the driver said that the Full Self-Driving technology in his Tesla was to blame for the accident. New footage shows exactly how the situation unfolded, and it looks bad for just about everybody involved.
Nine people were injured in the crash, but thankfully none of those injuries were life-threatening. Traffic was stopped for some 90 minutes as emergency responders cleared the wreckage and cared for the injured. Again, the Tesla driver specifically stated in the accident report that FSD malfunctioned and caused the crash.
Video obtained by The Intercept shows the Tesla moving from lane two to lane one and then slowing down inexplicably as traffic continues to flow at a steady rate. Almost immediately, traffic in lane one, the fast lane, backs up and collisions begin.
More: Video Proves Tesla’s Full Self-Driving Beta Still Performs Terribly In The Snow
While Tesla’s AutoPilot and Full Self-Driving autonomous technologies often garner attention for their failures, it’s rare to see one behave in the way that this car did. In fact, the only time that FSD should slow down and pull over like this is when it detects that a driver has gone to sleep or when they are otherwise unresponsive. Even if that was the case in this incident, pulling over to the left lane isn’t exactly the safest course here.
To make matters worse, this was the same day that Tesla CEO Elon Musk triumphantly announced that all of North America would now have access to FSD. It’s unclear whether the driver of this Tesla had only gained access that day or had previous experience with the technology.
Regardless, Tesla does require users to acknowledge that they must remain alert and in control of the car at all times while using AutoPilot or FSD. Why didn’t the driver hit the accelerator when they noticed the car slowing down on the highway? That likely would’ve prevented the accident altogether.
Nevertheless, it’s clear that there’s good reason for criticism of FSD and AutoPilot. Not all drivers can be trusted to take over when they need to. It’s unsurprising, then, that the NHTSA has dozens of open investigations surrounding the technologies.