Video of the Fatal Uber Self-Driving Car Crash Upends the Victim-Blaming Narrative

A video released by Tempe police from inside the Uber car shows the backup driver's attention diverted from the road before the collision that killed Elaine Herzberg.

The first accounts from police about the self-driving Uber car that struck and killed Elaine Herzberg in Tempe, Arizona, rushed to absolve Uber and blame the victim. They described Herzberg, 49, as appearing in the roadway “out of the shadows” “like a flash” and emphasized that she was not in a crosswalk.

It’s a template that police have followed after countless pedestrian deaths caused by human drivers. Every action of the victim is conveyed in the most accusatory light, while the driver’s actions aren’t questioned at all.

As with many of those cases, now that video from the Uber car has been released, the victim-blaming narrative doesn’t hold up. The images should alarm anyone living in an area where these vehicles are being operated on public streets.

The exterior video shows that Herzberg, who was pushing a bike to ferry belongings, had already crossed most of the street at the moment of impact — she didn’t hurl herself into the car’s path. She was outside a crosswalk, but a main selling point of autonomous vehicles is that they’re supposed to detect and safely react to such situations.

Writing at Forbes, Jim McPherson, a legal consultant on autonomous vehicle issues, says Uber’s sensors should have been able to detect Herzberg even in poor lighting. Other industry experts relayed similar conclusions to the Associated Press. Instead, the vehicle did not brake until impact.

Either there was some sort of technical failure, says McPherson, or the car was programmed not to swerve out of the lane in order to protect the vehicle and its occupants, which raises a whole host of ethical questions.

In the event of such a breakdown, a human “safety driver” is supposed to take control of the vehicle. But interior video shows the backup driver looking away from the road for significant stretches of time immediately before the crash.

It may be a year before the National Transportation Safety Board releases a detailed report on the collision, providing more definitive answers.

For now, Uber has temporarily suspended its AV testing. Other automakers have not. They may have better technology and protocols to prevent collisions, but the public has startlingly few assurances that necessary safeguards are in place.

That’s what public safety watchdogs have been trying to communicate. The regulation of AVs has been too lax as companies beta-test the technology on roads where a glitchy sensor, bad code, or momentary lapse of attention by the human back-up can easily prove fatal.

The crash in Tempe suggests that on a per-mile basis, Uber’s AV system is more dangerous than human drivers. The public should have assurance that companies are adhering to a higher standard if they’re going to test products on our streets.

319 thoughts on Video of the Fatal Uber Self-Driving Car Crash Upends the Victim-Blaming Narrative

  1. Stopping isn’t always an option.

    And voters would vote out any politician who suggests the 5 mph speed limit that would be needed to meet your irrational objective

  2. No, I already stated that I was discussing a situation where you cannot stop in time and so you have to hit something. And we need to decide in this case who we hit – a pedestrian or a speeding truck?

  3. There was no time for him to take over – the homeless woman walked right out into the path of the vehicle

  4. You need to prove your case that this vehicle could have avoided the homeless woman without taking a greater risk from other traffic. Your personal attacks merely demonstrate that you have no confidence you can make your case

  5. Impersonating nobody. Disqus allows duplicate names for very good reason. “Stuart” is the name of many people

  6. Is there any evidence that the driver could have stopped the vehicle when the homeless woman jumped out in front of the vehicle? I doubt that many human drivers could have avoided her either, without taking the risk of swerving into traffic

  7. She didn’t “jump out” in front of the car. The video is deceiving since you can only see 20′ ahead of the car (if that was the actual visibility, then the car shouldn’t have been driving >20 mph). The actual street is well lit and she would have been visible for hundreds of feet.

  8. Well that’s what we don’t know isn’t it? The cops appear to think she was at least partly at fault. But there may have been a flaw in the program logic as well. I’d assume that a decision had to be made whether to hit the woman or swerve left or right, both of which could have been risky.

    Just shows how hard it is to program these things.

  9. Wait until they own the sole rights to use the streets, deeded to them by cities seeking funding for infrastructure repairs: “Increasing safe access to major roads for bicycles and walkers, for example, might offer an unexpected benefit should the robocarpocalypse arrive: Neighborhoods with deliberately-built, well-maintained paths for non-automotive transit will retain the freedom of entrance and egress that the car currently insures everywhere.” (from an Atlantic magazine article predicting this)

  10. Is there anything relevant about where she sleeps? Why the need to emphasize the “homeless woman”?

  11. Because everyone knows that homeless people behave in a more stressed and unpredictable manner. In another newsflash that I feel sure will shock you, they also abuse drugs more.

  12. Right! Let’s make assumptions about the cause based on that assumption! In other news, certain demographic groups are worse drivers so all crashes they’re in are their fault too!

  13. Correlations don’t excuse responsibility. It’s actually the definition of causation.

  14. If you outrun your vision field, then you are driving too fast for the road conditions, hence you are speeding, if we are to take the Uber video at face value. Of course, it is safer to assume that the video has been doctored. At least, that is more plausible than believing that the dashcam in Uber vehicles is no better than a 90s webcam.

  15. Has the NTSB released any report (even a preliminary one) detailing the causes of the accident, containing recommendations or mandatory changes for manufacturers and operators of autonomous vehicles, as the FAA does after any airplane crash?

    I don’t know if it’s an Uber problem or an industry wide problem and I really don’t care.

    What I want is for the NTSB to be like the FAA, which after any incident publishes reports for the whole industry with its findings, recommendations and mandatory changes, no matter whether the airplanes were Boeing, Airbus or Embraer, or whether they were operated by United, Southwest, British Airways or Lufthansa.

    TL;DR: What I would like is for the NTSB, together with the industry, to analyze any incident involving an autonomous car (crashes, near misses, anything…) and release timely reports on the causes, mandatory changes and recommendations for both makers and operators.

    Silence can’t be a choice, If we want to advance the technology.


