How Uber’s Self-Driving System Failed to Brake and Avoid Killing Elaine Herzberg

The self-driving system detected Elaine Herzberg six seconds before impact, but Uber had tuned the emergency braking feature to be too insensitive to respond in time. Image: NTSB

The National Transportation Safety Board is out with a preliminary report into how an Uber car in self-driving mode struck and killed Elaine Herzberg in Tempe, Arizona, this March. The report doesn’t assign culpability for the crash but it points to deficiencies in Uber’s self-driving car tests.

Uber’s vehicle, a modified Volvo XC90, used radar and lidar to detect external objects. Six seconds before striking Herzberg, the system detected her but didn’t identify her as a person. The car was traveling at 43 mph.

The system determined 1.3 seconds before the crash that emergency braking would be needed to avert a collision. But the vehicle did not respond, striking Herzberg at 39 mph.

NTSB writes:

According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

Amir Efrati at The Information cites two anonymous sources at Uber who say the company “tuned” its emergency brake system to be less sensitive to unidentified objects.

Ars Technica’s Timothy Lee explains the company’s rationale:

The more cautiously a car’s software is programmed, the more often it will slam on its brakes unnecessarily. That will produce a safer ride but also one that’s not as comfortable for passengers.

Uber was in a rush to meet an internal goal of offering rides in self-driving cars to paying passengers in Arizona by the end of the year, Lee reports.

The NTSB report absolves the back-up driver, who glanced away from the road before the collision. She told investigators she was not distracted by a device but was looking at the self-driving system interface.

The report also uses some victim-blaming language. The NTSB notes that Herzberg had methamphetamine and marijuana in her system, points out she was crossing 360 feet away from the closest crosswalk, and says she “was dressed in dark clothing and that the bicycle did not have any side reflectors.”

The agency is still compiling evidence for a final report and has not assigned culpability to any party.

Yesterday, Uber announced it is permanently shutting down its self-driving operations in Arizona, perhaps in anticipation of the release of the NTSB report. The company, however, has not abandoned its tests of self-driving cars in Pittsburgh and San Francisco.

  • Augsburg

    This NTSB report is not surprising. Self-driving car tech would have been able to “see in the dark” and detect Ms. Herzberg. Sadly, the Tempe police reviewed the case based on overly dark video that was either doctored or simply not properly processed. The police assumed the overly dark video was definitive and ignored the other sensing technologies employed in self-driving cars. The conclusion is that law enforcement is completely unprepared to investigate crashes of self-driving cars, and it was inappropriate for the Tempe police to release the results of their flawed “investigation.” By the same token, AZ Gov. Ducey and his cronies were too quick to approve the self-driving car testing, as the state and local agencies were unprepared to govern and regulate the testing to ensure the safety of the public. To top things off, Uber’s lawyers reached a quick settlement with the relatives of the victim, all before the truth about Uber’s culpability was revealed.

  • gneiss

    Based on an approximate speed of 40 mph and a time of 6 seconds to impact, Elaine Herzberg was a little over 100 meters away from the vehicle when it detected her. Volvo indicates that the XC90 has a braking distance of 36 m in testing from 100 km/h (62 mph) to 0, which means the vehicle was well within the limits of stopping at 40 mph had it not dismissed her as a false positive and instead initiated braking. (A rough calculation is sketched after this comment.)

    Detecting and resolving false positives is one of the hardest tasks that you can conduct in software design, and while machine learning can assist, just like with humans, there will be situations where the software will fail to understand the condition that it is presented with and make the wrong call. In this case, it’s clear that the software engineers were under pressure to “tune” the false positive detection so it did not brake unexpectedly and inconvenience the occupants of the vehicle.

    Also, I am disturbed by the framing in the Ars Technica article. Calling the error a “bug” totally ignores the fact that the way it was tuned was a deliberate decision by software engineers rather than some kind of inadvertent mistake.

    What’s also disappointing about the NTSB report is that there’s no mention of how the built environment contributed to this death. Phoenix roads are hostile to pedestrians, and there’s no suggestion at all about how that contributed to this incident.
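A rough back-of-the-envelope check of the numbers in the comment above (a minimal sketch; the 36 m figure from 100 km/h is the commenter’s, and constant deceleration is assumed):

```python
# Rough check of the stopping-distance claim: how far away was Herzberg at
# first detection, and how far would the vehicle have needed to stop?

MPH_TO_MS = 0.44704          # metres per second per mile per hour
KMH_TO_MS = 1 / 3.6

speed_ms = 43 * MPH_TO_MS    # speed at first detection, ~19.2 m/s
detection_time_s = 6.0       # NTSB: first detection ~6 s before impact

# Distance covered between first detection and impact, assuming constant speed.
distance_at_detection_m = speed_ms * detection_time_s         # ~115 m

# Commenter's figure: ~36 m braking distance from 100 km/h.
# Braking distance scales roughly with the square of speed (d = v**2 / (2 * a)).
v100_ms = 100 * KMH_TO_MS
decel = v100_ms ** 2 / (2 * 36.0)                             # ~10.7 m/s^2
braking_at_43mph_m = speed_ms ** 2 / (2 * decel)              # ~17 m

print(f"Distance covered in 6 s at 43 mph: {distance_at_detection_m:.0f} m")
print(f"Estimated braking distance from 43 mph: {braking_at_43mph_m:.0f} m")
```

Roughly 115 m of warning against a braking distance on the order of 17 m leaves an enormous margin, even after allowing a second or two of response time at ~19 m/s.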

  • Downtown_Jon

    This just shows that AVs are subject to the same fundamental geometrical constraints as human-driven cars. Our entire roadway network is calibrated towards an ‘acceptable’ number of deaths each year. I don’t doubt that AVs are far, far safer than human drivers, but my sense is that if they were tuned to be as safe as they have been touted, riders would find them unacceptable.

  • thielges

    “…two anonymous sources at Uber who say the company “tuned” its emergency brake system to be less sensitive to unidentified objects.”
    This was exactly what I had hypothesized the day that this collision was announced: that the problem was due to tuning parameters in favor of performance at the expense of safety. (pats self on back 🙂
    This scenario will continue to play out over and over until there’s increased emphasis on safety over profits.

  • mx

    So the system is capable of autonomously accelerating to 43 mph, but not stopping even when it knows that emergency braking is required? That’s a murder machine, and I don’t understand why you shouldn’t go to jail for putting it on public streets.

  • Vooch

    A partial solution – AVs should drive 20% slower than the posted speed limit.

    It’s the excessive speeds that are the killer

  • It seems that Uber has not taught its cars one of the essential driving safety techniques, “cover the brake.” Most humans are taught that if you see something ahead that may be a danger, you take your foot off the gas and place it above the brake pedal without pressing the brake yet. This means the car is gently slowing down, and if you need to brake, all you need to do is push down on the pedal. This creates a smooth and safer ride, not the ride that Uber describes as uncomfortable for passengers.

  • Coffee123

    The crazy thing is that the car wasn’t even traveling at the posted speed limit. It was traveling 43 in a 35 mph zone. Why are AVs allowed to speed?

  • rohmen

    When this happened, I think people assumed the worst in terms of how the software performed largely based on Uber’s horrible track record on being ethical. Way to not disappoint, Uber.

  • User_1

    Uber is full of brilliant moves!

  • disqus_1pvtRUVrlr

    Wrong. It was traveling slower than the posted limit. That location is posted 45 mph. This has been discussed ad nauseam.

  • disqus_1pvtRUVrlr

    Yeah, cuz speed differential is always a good idea. For better or worse, speed differential, especially on faster roads, increases crash rates.

  • Coffee123

    Just double-checked my sources. You’re right, it was a 45 zone.

  • 94110

    Strangely, it may be that the car did just that. “The car was traveling at 43 mph. …striking Herzberg at 39 mph.” Four mph sounds like about the amount of speed you’d give up if you covered your brake for six seconds.
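A quick sanity check of that guess (a minimal sketch assuming the 4 mph were shed at a constant rate over the full six seconds):

```python
# Implied deceleration if the car coasted from 43 mph to 39 mph over 6 s.
MPH_TO_MS = 0.44704

v_initial = 43 * MPH_TO_MS   # ~19.2 m/s
v_impact = 39 * MPH_TO_MS    # ~17.4 m/s
duration_s = 6.0

decel = (v_initial - v_impact) / duration_s   # ~0.3 m/s^2

print(f"Average deceleration: {decel:.2f} m/s^2")
# ~0.3 m/s^2 is roughly what coasting or light engine braking produces,
# far below the several m/s^2 a car can manage under hard braking.
```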

  • Here’s a Google map I created showing crosswalks, streetlights, etc. There is a crossover paved area that looks like it would save quite a few steps, compared to the crosswalk.

  • 94110

    Unfortunately, I’d think the most common “unidentified object” would be a plastic bag floating in the wind.

    That said, I’ve slammed on my brakes for what turned out to be a plastic bag, and I have no regrets over it.

    But… I thought I had read in another article that the system identified “the object” as a bike?

  • 94110

    I doubt that any police department will make the same mistake ever again. That said, there are always exciting new mistakes to be made!

  • thielges

    LIDAR, size measurement, and motion tracking should have been able to discriminate between a plastic bag and a person. And indeed it did, from the NTSB report: “As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.” Though the victim was briefly classified as an “unknown object,” the fact that the system classified her as a vehicle and a bicyclist too should have been enough to brake. And I agree with your conclusion that braking for an unidentified object is the right thing to do, even if that object turns out to be just a plastic bag floating in the wind.

  • 94110

    Interesting. I haven’t read the report yet. I’d be curious how long she was categorized as each, and what the various future travel paths were.

    So from a software point of view, “unknown object” could be a transitory state that gets applied to any new object, including floating bags and rain.

    One bug could be taking too long to recategorize her, but it also could be that she was an unknown object for only a fraction of a second.

    Once she was classified as a vehicle and a bicycle, the only acceptable reason not to brake would be if the expected future travel path showed it wouldn’t be necessary. That would be a major bug. (A toy sketch of this decision logic follows this comment.)

    If the expected future travel path showed a collision imminent, all excuses go out the window. Criminal charges should probably be considered (IANAL).

    Sounds like the answer is probably criminal negligence.
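Purely as an illustration of the logic being debated in this thread (a hypothetical toy sketch; none of these names or thresholds come from Uber’s actual software), the key point is that the brake decision should hinge on the predicted path, not on whether the object’s class label has settled:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrackedObject:
    """A detected object with a (possibly changing) classification history."""
    classifications: List[str] = field(default_factory=list)
    time_to_collision_s: float = float("inf")   # derived from the predicted path

def should_emergency_brake(obj: TrackedObject, threshold_s: float = 2.0) -> bool:
    # A safety-first policy: brake on the predicted path alone. An "unknown"
    # label is not a reason to ignore an object whose trajectory converges
    # with the vehicle's.
    return obj.time_to_collision_s <= threshold_s

# The scenario in the NTSB excerpt: the label flips between classes while
# the paths converge.
pedestrian = TrackedObject(
    classifications=["unknown object", "vehicle", "bicycle"],
    time_to_collision_s=1.3,
)
assert should_emergency_brake(pedestrian)
```

Under a policy like this, a 1.3-second time to collision triggers braking regardless of whether the object is labeled unknown, vehicle, or bicycle.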

  • LazyReader

    Who rides a bike in the middle of the night, wearing an all-dark ensemble, with no reflectors, and not in the crosswalk? Someone who’s probably never biked, ever… Even if there had been a driver in that Uber car, she’d have gotten a fender massage regardless.

  • kevd

    Six seconds would not even have required slamming the brakes.
    Just slowing and steering.
    Jesus. A “beep” would have alerted the driver with plenty of time to react accordingly.
    Once again we see that Uber is a despicable company.

  • kevd

    “I don’t doubt that AVs are far, far safer than human drivers”
    So far, they are not.

  • Bernard Finucane

    You obviously haven’t read the report.

  • Bernard Finucane

    We also see that the Tempe police have no interest in public safety.

  • Camera_Shy

    Not the least of which is disabling all the “self-driving” safety features when testing their “self-driving” cars…

  • Brandon

    As long as the “drivers” are absolved, even when their job is to pay attention and take over, there will be no change. If the test driver feared that they would be liable for murder if they killed someone, they would pay better attention.

  • ExpoRider

    This information proves that Isaac Asimov was a hopeless idealist when he wrote the First Law of Robotics: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
    Or to put it another way: “Autonomous vehicles don’t kill people, greedy capitalists kill people.”
