A lawsuit in Florida claims that a man died as a result of using Tesla’s Autopilot because the company failed to “fix a known issue” with the software. Multiple engineers confirmed in court documents that the system wasn’t designed to avoid the type of accident in question. Despite that, the family blames Tesla for their loved one’s death.

This case isn’t new. It’s the same one we first told you about in May 2019, when driver Jeremy Banner died after his Model 3 struck the broad side of an 18-wheeler. He had engaged Autopilot just 10 seconds before the collision and, for reasons that remain unknown, did not avoid it himself. What’s new today is the revelation that two engineers admitted in 2021 that the software wasn’t designed to avoid such accidents.

One of the engineers is quoted by Bloomberg as saying that if “there’s cross traffic or potential for cross traffic, the Autopilot at the time was not designed to detect that.” That statement, along with a similar one from a second engineer around the same time, is being presented as evidence that Tesla neglected its duties, since the company knew at the time of Banner’s accident that a similar fatal incident had occurred in 2016.



In that earlier crash, another man died after his Model S struck the side of a semi-truck trailer. Again, why the driver didn’t avoid the collision himself is unclear. At the time, Tesla released a statement offering condolences to the family but also noted that “when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance.”

“There is evidence in the record that the defendant Tesla engaged in intentional misconduct and/or gross negligence for selling a vehicle with an Autopilot system which Tesla knew to be defective and knew to have caused a prior fatal accident,” the Banner family said in the amended complaint.

At this point, it seems almost as though lawyers across the nation are throwing every argument they can find at Tesla in hopes that something finally sticks. At least one jury has already found that the driver is at fault when a crash happens with Autopilot engaged. If that holds true here, then Autopilot didn’t malfunction; rather, a driver failed to avoid a danger that any human should have seen.
