The National Highway Traffic Safety Administration has sent a letter to Tesla asking why it did not issue a recall when it deployed software changes to its Autopilot driver-assistance system.
In September, the automaker sent an over-the-air update to its cars that was reportedly aimed at helping them better detect emergency vehicles stopped on the side of the road. The update followed an investigation that NHTSA launched into Autopilot after identifying 12 crashes involving emergency vehicles.
In its recent letter, seen by Reuters, the regulator asked the automaker about the update, stating that the law requires automakers to issue a recall “when they determine vehicles or equipment they produced contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard.”
NHTSA said the update was designed to help Tesla vehicles detect flashing emergency vehicle lights in low-light conditions. That suggests the regulator believes Tesla should have issued a recall to comply with the law, though the company has until November 1 to respond to the letter with its reasoning.
The agency also raised questions about the company’s Full Self-Driving beta, released in October 2020. Specifically, the regulator took issue with limits Tesla reportedly places on disclosures by drivers using the feature.
It wrote that drivers using the beta “have non-disclosure agreements that allegedly limit the participants from sharing information about FSD that portrays the feature negatively, or from speaking with certain people about FSD […] even limitations on sharing certain information publicly adversely impacts NHTSA’s ability to obtain information relevant to safety.”
This is just the latest chapter in a saga that has seen U.S. officials become increasingly vocal about the automaker’s practices.