Tesla has begun asking its limited pool of Full Self-Driving (FSD) Beta drivers to allow footage recorded by the car’s onboard cameras to be used in the event of a crash. This footage will be “VIN-associated,” meaning it can be linked to a specific car and/or user.
Although Tesla has used footage from its cars’ cameras in the past, that data had previously been kept anonymous, with users reassured that the video imagery could not be linked to a specific vehicle. Instead, it was used to train the company’s machine-learning systems.
However, according to a report by Electrek, this is no longer the case for FSD Beta drivers. The change in permissions arrives with a new software update that includes the following disclaimer: “By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision.”
Why the change? With mounting pressure from regulators and the public, Tesla likely wants the ability to present video evidence when its FSD system is blamed for a crash.
As FSD Beta rolls out to a broader audience, including owners who purchased the system and scored above 98 on Tesla’s “Safety Score” rating, the growing number of users could lead to a rise in incidents.
Regardless of how accident footage is used, the debate over whether Tesla’s “Full Self-Driving Beta” is misleadingly named will rage on. The current system is classified as Level 2 on the SAE autonomy scale, whereas a car must reach at least Level 4 to be considered fully autonomous.