Uber Got Off the Hook for Killing a Pedestrian with its Self-Driving Car

Who killed Elaine Herzberg? Not the driver of the car that ran her over — because there was no driver. And therein lies a problem.

Arizona prosecutors’ decision not to criminally charge Uber for the March self-driving-car death of Herzberg signals that tech companies won’t be punished for taking egregious risks with their untested technology, even when the worst happens — in this case, the crash that killed Herzberg, a homeless woman in Tempe who became the first person killed by a self-driving car.

Uber has already settled a civil case with Herzberg’s family, and the National Transportation Safety Board has yet to release the full findings of its ongoing investigation. Meanwhile, local authorities say that the car’s “backup driver” may still be charged with vehicular manslaughter: she was watching “The Voice” on her phone when the car hit Herzberg, and she did not hit the brakes until after the collision.

But Uber isn’t blameless; a preliminary report by NTSB found that Uber had deactivated the car’s emergency braking system. And that decision comes down to money. Self-driving cars can be programmed to brake whenever there is an object that the computer system can’t identify, which in tech jargon is called an “edge case.” But programming the car that way can make the journey jerky and nauseating. Uber was in a rush to start its self-driving taxi service that summer, so it had programmed the car to take chances.

Indeed, the car’s detection system had noticed Herzberg six seconds before the collision. But it did not brake because it read her shape — she was pushing a bicycle with bags — as benign.

Advocates for Herzberg — and, indeed, for future victims of autonomous vehicles — were disappointed in how the Yavapai County Attorney handled the case.

“How is it that … Uber can experiment with potentially deadly technology in public, but two women are paying the price for it?” Jim McPherson, an attorney who works in the transportation sector and writes about self-driving safety issues, told Streetsblog.

The National Highway Traffic Safety Administration and Congress have taken a hands-off approach to regulating self-driving-car companies. There are currently no special regulations to ensure the safety of these vehicles.

Instead, the feds are relying on companies to police themselves — and the Herzberg case is a good example of the problem with that approach. The District Attorney’s decision to not hold Uber criminally responsible is reminiscent of how law enforcement officials rarely hold drivers accountable for crashes under the outmoded and car-friendly belief that it was just “an accident” — as if no one had any agency at all.

Uber suspended its self-driving vehicle operation following Herzberg’s death, but plans to resume operations in some cities this summer, according to reports.

48 thoughts on “Uber Got Off the Hook for Killing a Pedestrian with its Self-Driving Car”

  1. Wow, can we leave the occupant’s gender and prior convictions out of this reporting?
    You haven’t really explained how these are at all relevant to the story.
    I expect better from Streetsblog.

  2. I’m not a lawyer, so bear with a potentially stupid question: Would it be possible for the “backup driver” being charged for “vehicular manslaughter” to sue Uber about this?

    Also, would it be possible to appeal to the DA’s decision at a higher level court?

  3. Until cars are approved for self-driving without a backup driver, the backup driver is the driver. The self-driving (testing) functionality is no more than glorified driver assistance. Maybe these companies should be required to have a camera pointed at their backup drivers and shut down the self-driving system if the backup drivers are not paying attention.

  4. This decision is mind-boggling! See a benign shape and just go ahead and run over it? And then get away with it, in what sounds more like negligent homicide than manslaughter?

  5. Who is speaking for the dead victim? Where is her family in this? Shouldn’t they be pressing for more action on this decision?

  6. This decision shows that all Arizona residents living in self-driving-car pilot-testing areas are at risk.

  7. Being reckless with people’s lives and then killing them should result in some legal liability.

  8. Her family that already sued Uber over it? I’ll hazard a guess they signed NDAs preventing them from discussing the case in public, in return for the settlement they received in their civil case.

  9. I am a lawyer, so…

    “Would it be possible for the ‘backup driver’ being charged for ‘vehicular manslaughter’ to sue Uber about this?”

    Highly unlikely, and all the more so considering the driver’s own negligence could readily be argued by Uber (or others) as the single biggest contributor to the pedestrian’s death. Yes, the car’s AV system failed, but that’s literally the entire point of having backup drivers in the first place. And in this case she was engaging in prima facie negligence caught on camera, namely watching “The Voice” instead of the road.

    “Also, would it be possible to appeal to the DA’s decision at a higher level court?”

    No: only court decisions can be appealed, not a prosecutor’s decision not to pursue a given case. The only exception would be egregious prosecutorial misconduct, and merely making a judgment call that outside observers take issue with doesn’t qualify as such.

  10. Speaking as a lawyer: like it or not, the bar for charging a corporation with criminal negligence is extremely high, and it was clear from day one that the Arizona prosecutors would find it problematic to do so – all the more so given that the whole point of having a safety driver in the vehicle in the first place was to prevent accidents like this one. A prosecutor would also have to prove that Uber acted wantonly and recklessly – and again, merely shutting off the car’s “edge case” sensors isn’t enough on its own to qualify as such.

    Further, such a claim could be pursued in court only if Uber was (provably) the primary tortfeasor. In this case one could plainly argue the driver was more negligent than Uber, and even that city and state officials were in significant part negligent in granting Uber an operating permit on public roads for such woefully undertested software in the first place.

    Btw I realize everybody wants to “hate Uber,” but if you want to take aim at gross miscarriages of justice, there are vastly bigger examples. Anyone know offhand how many thousands of people have been maimed or killed by the Takata airbags installed in over 20 million cars – despite the company (and possibly some auto manufacturers as well) knowing they were faulty? Hell, even today’s crash of a second Boeing 737 MAX, on an Ethiopian Airlines flight that killed all 157 people aboard, is almost certainly a software-related failure, and one Boeing plainly should’ve known about long before delivering their flying deathtraps to commercial buyers.

  11. “….Self-driving cars can be programmed to brake whenever there is an object that the computer system can’t identify, which in tech jargon is called an “edge case.” But programming the car that way can make the journey jerky and nauseating…..”

    Imagine any other industry being so arrogant that it believes a minuscule addition of comfort is worth killing people.

    Now imagine a society so depraved that it agrees.

  12. Last I read, that lady was crossing a busy road illegally and literally walked out in front of the moving car. The reason the car didn’t stop was some bug in how it read the way she was walking, or something.

    I don’t see how they will get a conviction.

  13. One of the fundamental issues at play is that the safety drivers were given a pretty much impossible task. Human beings are simply not wired to be alert and ready to jump in 100% of the time in situations where our attentiveness is actually needed less than 1% of the time.

  14. The car detected the presence of the victim 6 seconds before impact, plenty of time to hit the brakes. Yet the car was configured to ignore this particular detection.

  15. Impossible? Seriously? The driver was watching “The Voice” on her phone when the car killed a pedestrian! And are you even aware how many thousands of miles truck drivers typically drive each week, mostly without incident? (let alone maiming and slaughtering cyclists – and in vastly bigger & more difficult-to-control vehicles at that)

    While yes, it’s impossible to remain alert 100% of the time, I’d expect a test driver to at least be as alert as she’d be behind the wheel of a non-autonomous vehicle – far more than 1%, in other words. Watching TV on one’s smartphone instead of paying attention to the road is practically the definition of negligence – and note that prosecutors have not yet decided whether to individually charge the driver for it. Based on everything we’ve seen thus far — most of all the recording of the collision, showing the vehicle had ample time to hit the brakes or swerve before killing an innocent bystander (even with a human driving it) — I think prosecutors would have a considerably more solid case.

    Finally, there’s been no reasonable suggestion Uber’s test drivers were “overworked,” even if they reported boredom from hours spent behind the wheel.

  16. How about a sane response like beginning a discussion about when do we end the horrid 100 year American experiment with private motor vehicles?

  17. I’m not sure why you’re blaming “the industry” when this problem is limited to a single company that had no business conducting early-stage testing on public roads in the first place (or arguing “society agrees” with Uber’s policy here). Waymo’s vehicles have driven tens of millions of miles with zero serious incidents, let alone fatalities. (And btw Waymo’s cars are already many times safer than ones driven by humans, even in their beta-testing phase.) Ditto every other company testing AVs – and FYI there are at least two dozen of them around the world, even if Uber and Waymo garner nearly all of the publicity / media attention in the area.

  18. If a driver had hit and killed a person who was illegally in the roadway they would not have been charged either.

  19. Please feel free to outline which part of my comment is lacking in “sanity.” Please also feel free to explain why you apparently think private motor vehicles are an “American experiment,” considering they’re commonplace in all developed countries.

    Btw I don’t disagree about the deleterious effects automobiles have produced, but the irony of your comment is that autonomous vehicles will likely replace most private vehicle ownership once the technology is perfected. (It’ll also likely be deployed on most, if not all, public transit systems, for that matter.)

  20. They are cashing fat settlement checks in exchange for shutting up. All after letting her live on the street for YEARS.

  21. Blame the victim much? A “normal” driver could, and would, still be charged if they could’ve readily avoided a fatal collision but for their own gross negligence – which by all appearances is what happened here. You don’t get off scot-free for killing jaywalkers simply because they’re jaywalking.

  22. I’m not sure about Arizona, but in California you’re responsible for avoiding hitting pedestrians regardless of whether or not they were legally in the right about where they chose to start crossing. It’s not like jaywalking laws always remove 100% legal responsibility from drivers (whether prosecutors will actually go after the drivers in those sorts of situations is obviously another story, though).

  23. More likely the configuration of the software, not a physical part of the car. When categorizing things that the sensors “see” there’s always a gray zone where some sensor results cannot be confidently categorized. The system then has the choice of treating the readings as an obstruction to avoid (i.e. hit the brakes) or ignore the results under the assumption it is a false reading. Uber chose to ignore in this case. Customer comfort over safety.
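The brake-or-ignore choice described in the comment above can be sketched in a few lines. This is a hypothetical illustration, not Uber’s actual software: the function name, confidence score, and threshold values are all invented, purely to show how a single tunable threshold turns the same ambiguous detection into either a braking event or a dismissed “false reading.”

```python
# Hypothetical sketch of the trade-off: when a perception system can't
# confidently classify a sensor reading, it must either treat it as an
# obstruction (brake) or dismiss it as a false reading (ignore).
# All names and numbers here are illustrative, not Uber's code.

def plan_action(detection_confidence: float, brake_threshold: float) -> str:
    """Brake when confidence that the reading is an obstruction meets the threshold."""
    return "brake" if detection_confidence >= brake_threshold else "ignore"

# The same ambiguous reading (confidence 0.4) produces opposite outcomes
# depending on how the threshold is tuned:
print(plan_action(0.4, brake_threshold=0.3))  # cautious tuning -> brake
print(plan_action(0.4, brake_threshold=0.7))  # comfort tuning -> ignore
```

Tuning the threshold higher means fewer jerky false-alarm stops but more missed real obstacles; that is the “customer comfort over safety” choice the comment describes.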

  24. I read that two drivers were used in self driving test cars, because inevitably a solo “driver” would nod off. Although maybe that was also to qualify for car pool lanes.

  25. The experts in Three Revolutions say trains first, then freight trucks on freeways (with drivers taking over at off-ramps), then maybe buses on freeways, and probably decades before AVs can handle all the variables of car driving: weather, other road users.

  26. The algorithm probably recognized her as a homeless cyclist and gambled that any legal settlement would be negligible. Now if she’d been using a baby carriage for her belongings, the brakes would have slammed, and the driver would have been snapped out of “The Voice.”

  27. But there was a driver, albeit a “back-up” or “safety driver,” who was watching a video at the time. From Planetizen:
    “The incident placed the spotlight not only on self-driving technology but the human “testers” that are supposed to perform as a safety check should the autonomous technology fail.”

    “In June, the Tempe Police Department released a report that said the safety driver was streaming the television show “The Voice” on her phone in the minutes leading up to the crash, The Arizona Republic reported.”

  28. Not blaming the AV industry; I am condemning our entire society, which believes killing innocents is a worthy exchange for a bit of driving comfort.

  29. Okay, but I still think you’re distorting the matter (though to be fair the article is as well): AVs are intended to save lives, not take them. Last year over 40,000 people were killed by cars in the U.S. alone, including 6,000+ pedestrians. Over 90% of these accidents (more like 95%, actually) were preventable and a result of driver error of some kind.

    Despite the Uber fatality, AVs are already significantly safer than human-driven automobiles on the whole — or at least Waymo’s vehicles are (but they’re many years ahead of Uber in terms of development). This is despite the fact that they can’t yet be driven without a safety driver behind the wheel — and in this case I mean someone who’s actually paying attention to the road as opposed to her cellphone.

    This has nothing to do with the inaccurate suggestion contained in the article that people support driving “edge case” AVs because passenger “comfort” would otherwise decline. I’d hazard a guess that most people feel the opposite, actually, and like me think Uber never should’ve been allowed to test its early prototypes on public roads. I have no problem with Waymo doing so, but only because its cars were tested on closed roads for upwards of a decade before commencing real-world testing: indeed, Waymo has yet to report any serious accident in 15 million miles of testing, and the few it has been involved in were cases in which another driver was at fault.


  31. This judgement completely misses the mark. The fact is, most of the “self-driving” capabilities of the vehicle were disabled, so as not to make the ride experience for the customers uncomfortable. If Uber did not tell the human “driver” that self-driving features were disabled, then Uber is responsible for the death.

  32. And how do we know this? Maybe the programming fault would have caused the car to go faster? Smells like a double-standard to me! Got money? Problem goes away. Got no money? You’re screwed!

  33. Because the software didn’t recognize the object properly it told the car to ram it and self-destruct the car? 6 seconds is an eternity for an android.

  34. The CEO of Uber *at the time of the killing* was Travis Kalanick, who is well documented to be a complete jackass with no honor, and who has a long record of misconduct such as concealing rapes, writing software to hide from the police, stealing workers’ wages, and promoting sexual harassment.

    Kalanick has been removed as CEO.

  35. Trains are already automated (Vancouver SkyTrain, Docklands Light Railway).

    It’s much harder to automate anything on the roads because the roads are wild and crazy. Train tracks are consistent and reliable.

  36. This driver is no “victim”: she was literally watching a TV show on her smartphone instead of the road at the time of the collision. If she had been paying attention, she wouldn’t have run over and killed a pedestrian; as the videos of the incident make clear, she had ample room to steer clear.

    While yes, pedestrians who stray into the roadway where a driver can’t stop in time to avoid hitting them are arguably contributorily negligent, that wasn’t the case here.
