The passenger in a 2022 crash that claimed the life of a Tesla engineering recruiter behind the wheel of a Model 3 has alleged that the driver was using “Full Self-Driving” (FSD) when the vehicle careened off the road.

The fiery crash that claimed Hans von Ohain’s life occurred near Denver in 2022 as he and his friend, Erik Rossiter, were returning home after a game of golf. Both men’s blood alcohol levels were over the legal limit. Rossiter’s claim that FSD was engaged raises questions about the system’s safety and the way drivers interact with it.

Read: Tesla Autopilot Recall Fix Sparks Complaints From Both Owners And Regulators

Von Ohain’s widow, Nora Bass, described her husband as a true believer in both Tesla and its CEO, Elon Musk, in a recent interview with The Washington Post. She said he used FSD at every opportunity, though she found the system too jerky and unsettling to use herself.

That description was echoed by Rossiter, who stated that his friend used FSD both on the way to and from the golf course. He added that the ride there was “uncomfortable” because the car drove jerkily, forcing von Ohain to repeatedly correct its course as the Model 3 struggled to navigate the winding roads outside Denver.

Challenges Surrounding FSD and Data Recovery

Rossiter described his memory of the accident as spotty. While NHTSA documents confirm that Tesla’s driver assistance systems were engaged 30 seconds before the crash, little other official data is available. The car was too badly burned for onboard data to be recovered, and von Ohain was driving in too rural an area for over-the-air data to confirm an FSD timeline.

In fact, official data cannot even confirm whether it was FSD or Autopilot that was engaged, and the distinction may be meaningful. Although Autopilot has been investigated in connection with fatal accidents, Tesla maintains that no fatal accident has ever been recorded while FSD was engaged.

That has helped the automaker justify the technology’s use on public roads, even though it officially remains in “beta.” Tesla points to America’s high number of road fatalities as evidence that driver assistance technologies like FSD need to be developed as quickly as possible, and should therefore continue to be allowed on public roads even before they exit beta testing.

However, critics argue that the nature of FSD as a Level 2+ driver assistance system, combined with Tesla’s insistence on branding it “Full Self-Driving,” makes it prone to misuse. Indeed, Bass said she believes the automaker bears some responsibility for her husband’s death, despite his inebriation.

“You’re told that this car should be smarter than you, so when it’s in Full Self-Driving, you relax,” she told The Washington Post. “Your reaction time is going to be less than if we were not in Full Self-Driving.”

However, Bass said that, due to von Ohain’s level of intoxication, she has been unable to find a lawyer willing to take the case to court. Nonetheless, experts agree that consumers need to understand that their vehicles cannot yet fully drive for them. Others, such as Andrew Maynard, an Arizona State University professor of advanced technology transitions, are even more direct, saying that FSD “isn’t quite ready for prime time yet.”

Criticism of Tesla’s Handling and Support

Tesla’s stance on FSD isn’t the only aspect drawing criticism. Von Ohain’s parents and his widow also expressed disappointment with how the company treated them after his death. Bass described Tesla’s silence regarding her husband as almost cruel, noting that the first communication the family received from the company was a termination notice addressed to the late von Ohain.
