Editor’s Note: An earlier version of this story incorrectly stated in the title that this was the first accident involving a Google self-driving car; only the title has been corrected. We have also posted an updated story on the incident that includes a new video and an official statement from a Google spokesperson.
Many companies are developing self-driving cars, the latest example being VW’s TAP (Temporary Auto Pilot), which was recently demonstrated on video.
An earlier effort is Google’s automated, driverless fleet of Toyota Prius hybrids, developed with the help of Chris Urmson, winner of the 2007 Urban Challenge held by the military’s DARPA (Defense Advanced Research Projects Agency), and Anthony Levandowski, maker of the world’s first autonomous motorcycle (again, for DARPA) and of a self-driving pizza-delivery vehicle.
The question many skeptics raise is who would be held responsible in an accident involving a self-driving car. Handing control over to a machine may sound like a good idea: the car would obey speed limits, never do anything dangerous like overtaking on the right, and the streets would be safer.
However, as a recent accident near Google’s campus involving one of the company’s Priuses demonstrated, practice is a different thing from theory.
Google’s autonomous car researcher Sebastian Thrun has called the system “the perfect driving machine” and said that self-driving cars could save one million lives every year, and that’s a goal worth pursuing. Google later revealed to Business Insider that the car in question was being driven by a human at the time, the implication being that the autonomous driving system was not the culprit.
While we’re not questioning Google’s statement, the issue remains: who exactly is responsible in the case of an accident involving a self-driving car? Since no two situations are ever the same, how can one rely on a computer to decide what’s best, and for whom: the passengers of the car it controls, those of another vehicle, or a pedestrian?
What if the software crashes? What if there is a hardware malfunction? Even PCs, with decades of development behind them, encounter such problems, which, while frustrating, are not life-and-death issues.
The thing is that Google has already convinced Nevada to legalize autonomous cars on the state’s freeways. And Google is operating its Prius fleet in California, where legislation does not ban the practice.
But is anyone ready to guarantee that AI-driven vehicles can operate in a fail-safe way, dramatically reducing accidents and casualties?
Story sources: Jalopnik, Business Insider & NBC