When Tesla removed radar from its vehicles in order to run its advanced driver assistance systems with cameras only, it received a lot of pushback from experts in autonomous vehicle research. Toyota’s self-driving subsidiary, Woven Planet, thinks there may be a nugget of wisdom hidden within the strategy, though.
Although Toyota will still have multiple sensors in robotaxis and other vehicles deployed on public roads, the company believes it can collect much more data from vehicles equipped only with cameras because it can afford to put the cameras in so many more vehicles.
“We need a lot of data. And it’s not sufficient to just have a small amount of data that can be collected from a small fleet of very expensive autonomous vehicles,” Michael Benisch, VP of Engineering at Woven Planet, told Reuters. “We’re trying to demonstrate that we can unlock the advantage that Toyota and a large automaker would have, which is access to a huge corpus of data, but with a much lower fidelity.”
Benisch said that the cameras used by Woven Planet cost about 90 percent less than radar or LiDAR sensors. Better still, the cameras can easily be installed in fleets of ordinary passenger vehicles, helping the company gather far more data, far faster. And even when its driverless systems are trained mostly on camera data, Woven Planet believes they can reach a level of performance similar to systems trained on data from high-cost sensors.
Indeed, many of the criticisms leveled against Tesla’s decision to run Autopilot and FSD on camera-only systems were more about the dependability of cameras on public roads than about the validity of camera data. If the information coming from the cameras becomes confusing, autonomous vehicle engineers want other sensor data with which to make important decisions, and some have suggested that Tesla’s phantom braking issues stem from its dependence on cameras.
“Phantom braking is what happens when the developers do not set the decision threshold properly for deciding when something is there versus a false alarm,” Phil Koopman, a Carnegie Mellon University professor who focuses on autonomous vehicle safety, told The Washington Post in February. “What other companies do is they use multiple different sensors and they cross-check between them – not only multiple cameras but multiple types of sensors.”
Benisch, though, believes that someday cameras may become good enough to depend on.
“But in many, many years, it’s entirely possible that camera type technology can catch up and overtake some of the more advanced sensors,” he said. “The question may be more about when and how long it will take to reach a level of safety and reliability. I don’t believe we know that yet.”