Following the death late Sunday of a woman struck by a self-driving Uber car in Arizona, we now know more about the circumstances of the tragedy, which could shake public trust in autonomous car technology.
The deceased has been identified as Elaine Herzberg, 49, and according to police at a press conference Monday, she was walking her bicycle outside the crosswalk while crossing the street in Tempe, Ariz. According to a San Francisco Chronicle report Tuesday, the collision took place after Herzberg "abruptly walked from a center median into a lane of traffic."
Sylvia Moir, Tempe's police chief, told the Chronicle that, having viewed videos from the car's cameras, "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway."
Moir went on to predict that, for that reason, Uber is unlikely to be found at fault in the crash, although she wouldn't rule out charges against the car's test driver.
Uber, however, announced Monday that it was suspending its self-driving car testing, effective immediately.
The tragedy is the first known pedestrian fatality involving a self-driving car, although a Tesla driver was killed in June 2016 while using the company's Autopilot technology. Uber's self-driving cars had logged 2 million miles of testing as of December, 1 million of them in the previous 100 days, Forbes reported at the time. So Uber's self-driving cars have now recorded one fatality in around 3 million miles, or roughly 33 deaths per 100 million miles, whereas the current U.S. rate is 1.16 deaths for every 100 million miles driven, according to figures reported by Wired.
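For readers who want to check that comparison, here is a quick back-of-the-envelope normalization of the two rates. It is only a sketch: the 3-million-mile figure is the article's rough extrapolation from the December Forbes report, not an exact total.

```python
# Back-of-the-envelope comparison of the fatality rates cited above.
# Inputs are the article's approximations: ~3 million Uber test miles
# and a U.S. average of 1.16 deaths per 100 million miles (via Wired).
UBER_DEATHS = 1
UBER_MILES = 3_000_000          # approximate, per the article
US_RATE_PER_100M_MILES = 1.16   # deaths per 100 million miles driven

# Normalize Uber's record to the same per-100-million-mile basis.
uber_rate = UBER_DEATHS / UBER_MILES * 100_000_000

print(f"Uber: ~{uber_rate:.0f} deaths per 100M miles")          # ~33
print(f"U.S. average: {US_RATE_PER_100M_MILES} per 100M miles")  # 1.16
```

On those (admittedly tiny-sample) numbers, Uber's rate comes out around 30 times the national average, which is why the per-mile framing matters more than the raw count of one fatality.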
Apple is ramping up its own self-driving car initiative, with up to 45 such cars on the road in California, the Financial Times reported Tuesday (https://www.ft.com/content/2866ce9e-2bfd-11e8-9b4b-bc4b9f08f381).
Comments
As much as I despise this company, all the preliminary evidence seems to suggest Uber was not at fault. There is plenty of data, and this is the first pedestrian death, so I'm sure it will be analyzed exhaustively. Hopefully something good comes out of it.
The June 2016 Tesla death was not a test driver; it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road. A semi turned across the road he was driving down. Anyone watching the road would have seen it and taken over control, but for whatever reason this driver did not, and his car went under the trailer, killing him.
Autonomous driving is still in its early days, which is why every company that has released the feature to customers so far has been very clear that drivers must remain alert and vigilant, ready to take over control when needed.
There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
It doesn't matter what the reality was. It matters what the perception is, and the perception among the masses is that a self-driving car killed someone. I bet most people don't even realize there was a person in the car.
As Colbert used to say on his old show, it has "truthiness". The masses are inclined to believe that self-driving cars (and robots) are bad and I believe the press will play this up in the forthcoming years because it's a populist message that's easy to understand.
It also doesn't matter that 37,000 people die in traditional car crashes in the U.S. each year and it won't matter if it's proven that self-driving cars result in fewer deaths. In the public's mind, every self-driving car crash is the equivalent of at least a thousand "regular" car deaths.
Wait until the first death from a completely unmanned car. I bet a mob destroys the car.