A car equipped with Uber's self-driving prototype system was involved in a fatal crash with a pedestrian in Tempe, Arizona. The accident reportedly occurred when a woman walking a bicycle across the street was struck by the prototype, which was in autonomous mode with an Uber safety driver at the wheel.
The woman was taken to the hospital, where she later died from her injuries, according to ABC15. Police are now investigating the accident, and Uber said it is cooperating with the authorities and halting its test program in the meantime.
This accident comes just two months after a Tesla Model S slammed into the back of a stopped firetruck in Los Angeles County. The driver apparently told the fire department the car was in Autopilot mode at the time. The crash highlighted the shortcomings of the increasingly common semi-autonomous systems that let cars drive themselves in limited conditions.
When Uber first began testing its unlicensed self-driving car program in California, the program was shut down after footage emerged of a prototype running a red light in front of a pedestrian. The company claimed the vehicle was being driven manually by an engineer when it ran the light, but other reports contradicted Uber's account.
This debacle also raises a technical question: how is it possible that some of the most advanced driving systems on the planet fail to see a human, or an obstacle the size of a firetruck, right in front of them?
Tesla, for its part, has been upfront about this limitation, cautioning drivers in its manual that the system is ill-equipped to handle this exact sort of situation: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
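One commonly cited reason for this limitation is that radar measures relative speed, so at highway speed a stopped vehicle produces the same return as roadside clutter (signs, bridges), and cruise-control systems typically filter such returns to avoid constant false braking. The toy sketch below illustrates that general heuristic; it is purely illustrative and not Tesla's actual logic, and the threshold values are made up.

```python
# Illustrative sketch (NOT Tesla's code): why a radar-based cruise
# controller can "lose" a stopped vehicle. Radar reports relative speed;
# returns whose apparent ground speed is ~0 look like stationary roadside
# clutter and are commonly discarded at highway speed.

EGO_SPEED_MPS = 25.0  # ~56 mph, above the 50 mph threshold in the manual


def ground_speed(relative_speed_mps: float) -> float:
    """Convert a radar return's relative speed into absolute ground speed."""
    return EGO_SPEED_MPS + relative_speed_mps


def is_tracked_target(relative_speed_mps: float,
                      min_ground_speed_mps: float = 2.0) -> bool:
    """Keep only returns that appear to be moving vehicles."""
    return abs(ground_speed(relative_speed_mps)) > min_ground_speed_mps


# A lead car moving at 20 m/s (relative speed -5 m/s) is tracked...
print(is_tracked_target(-5.0))   # True
# ...but a stopped firetruck (relative -25 m/s, ground speed 0) is discarded.
print(is_tracked_target(-25.0))  # False
```

The cut-out scenario in Tesla's warning follows directly: once the moving lead car leaves the lane, the only return ahead is a stationary one, which a filter like this throws away.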
Tesla's Autopilot is considered a Level 2 (partial automation) system, and the road from there to Level 4 (high automation) is a long one.
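For readers unfamiliar with the scale, the levels referenced here come from the SAE J3016 taxonomy; the short summaries below are a plain-language paraphrase, not SAE's official wording:

```python
# SAE J3016 driving-automation levels, paraphrased for reference.
# Autopilot is generally classed at Level 2; most AV programs target Level 4.
SAE_LEVELS = {
    0: "No automation: the human driver does everything",
    1: "Driver assistance: steering OR speed is assisted (e.g. cruise control)",
    2: "Partial automation: steering AND speed, but the driver must monitor at all times",
    3: "Conditional automation: the car drives, but the driver must take over on request",
    4: "High automation: no driver needed within a limited domain (e.g. a geofenced area)",
    5: "Full automation: no driver needed anywhere, anytime",
}

for level, description in sorted(SAE_LEVELS.items()):
    print(f"Level {level}: {description}")
```

The gap the article describes is the jump from Level 2, where a human must supervise constantly, to Level 4, where the system alone is responsible within its operating domain.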
Critics of autonomous vehicle (AV) technology point to two facts. One: neither Tesla, with its multi-camera system, nor GM's Cruise AV, with its five lidar sensors and 21 additional radar sensors, has made the AV experience a smooth ride, as the incidents above prove.
Two: the prohibitive cost of getting a car to Level 4. Level 4 requires the combination of lidar, which can see clearly in 3-D; cameras, for color and detail; and radar, which can detect objects and their velocities at long distances. Lidar, in particular, doesn't come cheap: a setup for one car can cost $75,000. “Developing a system that can be manufactured and deployed at scale with cost-effective, maintainable hardware is… challenging,” wrote Bryan Salesky, who heads up Ford-backed autonomous vehicle outfit Argo AI. He also laid out the other hurdles facing his team in an article on the Wired website: “Vehicles need to be able to see, interpret, and predict the behavior of human drivers, human cyclists, and human pedestrians—perhaps even communicate with them. The cars must understand when they’re in another vehicle’s blind spot and drive extra carefully. They have to know (and see, and hear) when a zooming ambulance needs more room.”
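The point of combining the three sensors is that each covers another's blind spot: lidar confirms an object's 3-D position even when the camera cannot classify it and the radar has filtered it as stationary. The sketch below illustrates that complementary-sensor idea in miniature; every name and number in it is invented for illustration and belongs to no company's real stack.

```python
# Toy sketch of sensor fusion for a Level 4 stack: lidar gives 3-D
# position, cameras classify, radar gives velocity. All names and
# thresholds here are illustrative assumptions, not a real API.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class FusedTrack:
    position_m: Tuple[float, float, float]      # from lidar: 3-D location
    label: Optional[str]                        # from camera: "pedestrian", ...
    ground_speed_mps: Optional[float]           # from radar: absolute velocity


def should_brake(track: FusedTrack, ego_speed_mps: float) -> bool:
    """Brake for anything in the driving path that lidar confirms, even if
    the camera failed to classify it or radar flagged it as stationary."""
    distance = track.position_m[0]                       # longitudinal distance ahead
    stopping_distance = ego_speed_mps ** 2 / (2 * 6.0)   # assume ~6 m/s^2 braking
    return distance < stopping_distance * 1.5            # 50% safety margin


# A stopped, unclassified obstacle 40 m ahead, ego speed 25 m/s: brake.
near = FusedTrack(position_m=(40.0, 0.0, 0.0), label=None, ground_speed_mps=0.0)
print(should_brake(near, 25.0))   # True

# The same obstacle 200 m ahead: no action needed yet.
far = FusedTrack(position_m=(200.0, 0.0, 0.0), label=None, ground_speed_mps=0.0)
print(should_brake(far, 25.0))    # False
```

In this toy version, the lidar-confirmed position alone is enough to trigger braking, which is exactly the redundancy a radar-plus-camera Level 2 system lacks — and why, as Salesky notes, the full sensor suite is so hard to make cheap.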
As the accident in Tempe, Arizona, proves, self-driving technology is not yet intelligent enough to overcome the hurdles it will face in real-world situations. Even with someone behind the wheel to handle emergencies, as in this case, there is no assurance that nothing will go wrong. Uber's accident suggests to those claiming that self-driving technology will be a reality by 2019, or even 2025, that it may take longer than that.