If Self-Driving Cars Aren’t Safer Than Human Drivers, They Shouldn’t Be on Public Streets


More details are emerging about how an Uber self-driving car struck and killed a woman in Tempe, Arizona, raising red flags about the testing of autonomous vehicles on city streets.

Tempe police rushed to absolve Uber since the victim, Elaine Herzberg, was outside a crosswalk, according to the Phoenix New Times. But the paper’s Ray Stern reports that it’s common for people to cross midblock at that location — something a human driver may have anticipated.

Police also said the Uber car was exceeding the speed limit, traveling 38 mph in a 35 mph zone, according to the San Francisco Chronicle, and that neither the vehicle nor the person behind the wheel, who is supposed to take control to prevent collisions, engaged the brake “significantly” prior to impact.
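For rough context, a back-of-envelope sketch of the distances involved at the reported 38 mph. The speed comes from the police account above; the reaction time and braking rate are illustrative assumptions, not figures from the investigation:

```python
# Back-of-envelope stopping-distance estimate at the reported 38 mph.
# Reaction time and deceleration are assumed illustrative values,
# not figures from the police report.

MPH_TO_MS = 0.44704
speed = 38 * MPH_TO_MS      # ~17 m/s
reaction_time = 1.5         # seconds; typical alert driver (assumed)
decel = 0.7 * 9.81          # m/s^2; ~0.7 g braking on dry pavement (assumed)

braking = speed ** 2 / (2 * decel)
total = speed * reaction_time + braking

print(f"speed: {speed:.1f} m/s")
print(f"braking distance: {braking:.0f} m")
print(f"total stopping distance, incl. reaction: {total:.0f} m")
```

Under these assumptions the car covers roughly 50 meters in three seconds, against a total stopping distance in the mid-40s of meters — which is why the question of whether anything braked "significantly" matters so much.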

Beyond the particulars of the crash, which is still the subject of an open investigation, transportation officials and safety advocates warn that the incident highlights the dangers of allowing autonomous vehicle testing on public streets without clear safety standards or guidelines.

Uber’s program has only logged a few million miles in self-driving mode — crossing the 2 million mile threshold in September while adding a million miles every 100 days, according to Forbes. Meanwhile, human drivers in the U.S. traveled about 86 million miles for every traffic fatality in 2016, reports the Insurance Institute for Highway Safety. (Other companies, including Tesla and Waymo, have compiled more mileage than Uber.)
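To make that gap concrete, a rough comparison of the two figures. The ~3 million AV-mile total is an extrapolation from the Forbes numbers cited above, not an exact count:

```python
# Rough comparison of the mileage figures cited in the article.
# The Uber total is an extrapolation (2M miles in September 2017,
# ~1M miles per 100 days thereafter), not an official count.

uber_av_miles = 3_000_000              # assumed extrapolation from Forbes figures
human_miles_per_fatality = 86_000_000  # IIHS figure for 2016

ratio = human_miles_per_fatality / uber_av_miles
print(f"Human drivers log roughly {ratio:.0f}x more miles per fatality")
```

On these assumptions, one death in a few million self-driving miles sits well short of the human benchmark — though a single crash is far too small a sample to settle the safety question either way.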

Arizona has become a testing ground for autonomous vehicles in part because of its hands-off approach to regulation. And last month, the California DMV passed a rule that would allow autonomous vehicles on public streets with no one in the driver’s seat, relying only on a remote operator to intervene in an emergency.

At the federal level, U.S. DOT has issued only voluntary guidelines for autonomous vehicle companies. Legislation pending in Congress would allow companies not just to test AVs but to sell them to consumers. A House version, the SELF DRIVE Act, passed by voice vote, but the Senate bill, AV START, has stalled.

Autonomous vehicles have the potential to be safer than human drivers, but these testing arrangements are proceeding with no agreed-upon safety standard to assess the technology.

The National Association of City Transportation Officials said in a statement that firm guidelines should be established because “the current model for real-life testing of autonomous vehicles does not ensure everyone’s safety.”

“In order to be compatible with life on city streets, AV technology must be able to safely interact with people on bikes, on foot, or exiting a parked car on the street, in or out of the crosswalk, at any time of day or night,” said NACTO Executive Director Linda Bailey. “Responsible companies should support a safety standard and call for others to meet one as well.”

Meanwhile, in the absence of such a standard, “the American public is serving as crash test dummies,” said Cathy Chase of Advocates for Highway and Auto Safety, an organization founded by the insurance industry and consumer watchdogs. “I see this as an urgent call to action.”

Chase wants to see an across-the-board pause on further loosening of autonomous vehicle regulations until the National Transportation Safety Board, which is investigating the Tempe crash, releases recommendations.

As the Tempe crash illustrates, detecting people walking or biking is a known weakness of self-driving cars. A number of active transportation advocacy organizations objected to the federal AV START bill on the grounds that the technology is not advanced enough to safely react to pedestrians, cyclists, or people in wheelchairs.

Advocates for Highway and Auto Safety has 12 recommendations to improve the legislation. One thing the public should insist on, says Chase, is a “vision test” for vehicles operating on public roadways. Just like a human driver would need to be able to spot distant objects and react to them, so should autonomous vehicles.

Relaying information from Tempe police, the SF Chronicle reported that Herzberg, the crash victim, was walking a bike “laden with plastic shopping bags,” implying that Uber was not at fault. But that’s the kind of scenario that arises on city streets all the time in real life.

“If an AV cannot properly react to that type of situation it should not be on the road,” Chase said.

247 thoughts on “If Self-Driving Cars Aren’t Safer Than Human Drivers, They Shouldn’t Be on Public Streets”

  1. Yes, and there is a sign on the lamp post that you describe: “BEGIN RIGHT TURN LANE – YIELD TO BIKES”.

  2. How can it be a crime scene? Robots cannot have malice. These vehicles are death machines. They are a legal loophole for drivers to escape moral accountability.
    How is this any different from blaming guns for mass murderers?

  3. Who knew robots couldn’t read? Anyone with half a brain cell, that is who. I suppose if you shoot a bullet into a crowd, the bullet is responsible for reading the sign that says “Gun Free Zone?” And the targets deserve to die if they cannot duck in time. This is craziness. The mayor of Tempe is morally responsible for allowing his citizens to be abused as lab rats.

  4. That is the problem. It has a camera which detects an object, not a human being. It is impossible for it to have a human reaction to the horror of killing another human being.

  5. It didn’t happen here because it is a big fat lie. No camera can react like the human eye. It is not connected to a soul.

  6. Awful. Cyclists call human beings killed by cyclists “vanishingly rare insignificant statistics” and cyclists merely frightened by cars “mother, fathers, sisters, brothers.” Now, motorists can call pedestrians killed by robots “a big learning experience.” This whole situation is dehumanizing.

  7. You had better check your tone.

    First off, you’re engaging in extreme hypocrisy by saying “don’t talk down to me” and then IMMEDIATELY going on to say I have the intelligence of a turnip.

    Not one single thing you just said counters anything I stated because I didn’t claim any of the things you are saying I did. I did not say or even remotely imply that drivers bear no responsibility for the safety of those on and around the road. I didn’t say or imply that pedestrians and cyclists are the only ones responsible for and in control of their own safety on roads. I know the ****ing laws for vehicles, pedestrians, and cyclists. That doesn’t change the FACT that pedestrians and cyclists are inherently at greater risk of injury or death in an accident in a space where they share a common thoroughfare with vehicles. That doesn’t make them automatically at fault. It doesn’t mean they deserve injury or death. It means that they will physically ‘lose’ the battle if they get hit by a car every single time, even when it is 100% the driver’s fault, and even if the driver goes to prison. That is an indisputable fact.

    Nothing you can say changes these facts. That’s the thing about them, they’re true whether or not you like them and no matter how wilfully you choose to misinterpret them.

    So next time you want to run your mouth, maybe get some damn reading comprehension skills and don’t ****ing lie and assign statements to someone who never made them in the first place.

  8. I’m not suggesting that the robot committed a crime. I’m suggesting that either the engineers and programmers at Uber are incompetent or the executives at Uber encouraged the engineers and programmers to cut corners in their haste to promote their AV technology. Criminal negligence and disregard for public safety are things, aren’t they?

  9. Any driver who scans to the side would have seen her approaching and would have been able to stop.

    Any driver who only looks straight ahead, and is therefore unable to react to approaching people, animals, and objects, is not competent to drive and should have his or her driver’s license permanently revoked.

  10. News flash: except in the remotest of settings (which aren’t the sorts of places that many pedestrians are found), every pedestrian has to rely on motorists complying with at least some remnant of the law. I have to cross multiple streets, on foot, on my way to work, most of them at signalized crosswalks. At any point while I’m crossing, a motorist might accelerate against the red light into my crosswalk. I have to cross the street, anyway, or else I won’t be getting to work.

    Pedestrians do not bear the responsibility to ensure that drivers don’t do stupid, illegal, and selfish things. Drivers do.

  11. And how long would that gap be? Long enough for an elderly or disabled person, or someone with young children, to make it across the road? Could slow-moving pedestrians ruin the “continuous flow” if they’re not quick enough to make it? How do you account for other road users such as cyclists?

    As it stands in my city, there are lights that I, a 30-year-old powerlifter and cyclist with a fairly active job, have trouble crossing in time without needing to get up to a bit of a jog to actually make it, because everything is centered around the car. Would “constant flow” perpetuate that flawed design?

    Don’t get me wrong, the idea of AVs is something I’m fairly excited about, once they get the tech where it needs to be. I do, however, think that the idea of “constant flow” is a pipe dream unless roads become 100% for AVs and are no longer considered public spaces, even for crossing, which isn’t feasible.

  12. The police may have misspoken. But there could be a crime here:

    A. If the driver was supposed to be more closely monitoring the operation, but wasn’t.

    B. If the car wasn’t actually in auto mode, but was being operated by the driver, who was not paying attention.

    All that said, people are reading their predefined biases into this crash. I’ve seen the dashcam video, and it looks like a somewhat competent driver or algorithm should have been able to see the person and slow down. Further, I suspect that this car would not have stopped for someone crossing in a crosswalk either. On the other hand, everyone else seems to be saying “dark, jaywalking, and not wearing hi-vis.”

  13. Now that the video is out, the headlights look normal. Honestly, Elaine would have seen the car if she looked. However, the car software should have recognized her and at least tried to stop.

    Anyone can see a car from the front at night, as long as the lights aren’t completely off. Even with just the markers or DRLs on! The dimmest headlight is still brighter than an average bike light, for example.

    So many times I’ve seen drivers with their headlights off, but DRLs on, because they’re bright enough to illuminate the road in front. What they don’t realize is that their back lights are completely off with just the markers or DRLs on.

  14. The video pretty clearly shows that the headlights were normal. You can see a car coming with no issue, even when just the marker lights are on, especially with modern cars, which often have bright LED markers.

    Keep in mind, a marker lamp is usually far brighter than a typical bike light, and you can see those coming from a mile away too. Having said that, again, the video shows the headlights were on.

    She simply misjudged the speed of the car.

    The rest of your comment was on point.

  15. I know it isn’t a serious transit system. But it is *sold* to us that way. And we are forced to subsidize it to a massive extent. And transit advocates are pushing for more dedicated dollars, and any and all alternatives (such as say… more traffic lanes) are shot down immediately because we have such a “world class” transit system that in reality sucks.

    You can’t have it both ways, either our multi-billion dollar world-class transit system sucks or it’s fantastic. Advocates want the best of both worlds so they claim both.

  16. You missed the point completely.

    The question was, can you see a car coming at night? The answer is an unequivocal yes, even with just the marker lights on! Having said that, the video shows the headlights were on too!

    This car was absolutely visible from the front. To claim otherwise is denying reality.

    And lay off with that victim blaming crap, I’ve mentioned, more than once, that the Uber is to blame for not even trying to stop.

  17. Yes, the video is reshot from an LCD screen so it is at least three times removed from reality before it even arrived at the media. Then it goes through the TV studio’s processing. A lot of important information was stripped away and now what appears on TVs across the country is being used to try this case in the court of public opinion.
    Well played, Uber.

  18. This is a situation where the technology (whether fully autonomous or just automatic braking) is supposed to be better than a human. We have people all the time walking across roads, assuming that drivers will be courteous and slow down for them. I bet this occurs about 100 times an hour in the USA. Unfortunately, Uber’s technology doesn’t seem to be ready. Of course, on the flip side, once enough autonomous vehicles are on the roads, we may start becoming more careful when crossing roads and not assume there is a human driver who will be courteous enough to slow down.

  19. An alert driver might have noticed when the pedestrian blocked the light that was behind her, about three seconds before impact. That likely would have tipped me off that someone might have been crossing the street. Also, the chart you include is stopping time. The car didn’t have to stop. If it had just steered three feet to the left, it would have missed the pedestrian. That should have been easy for either human or AI to do.

  20. The detector systems on the vehicle FAILED TO DETECT A BICYCLE – a large metallic object with plenty of radar/lidar-reflecting surface.

  21. “Jaywalking” is a USA horror that does not exist elsewhere …
    The car hit a METAL OBJECT in its path – a bicycle – an immediate software & detection systems FAIL

  22. indeed

    Notice how this topic has vanished from the media, especially since the public consensus is that Uber is at fault.

    Media understands who butters their bread

  23. Hello Mr. Tingey – This is a bit off track, but I have an ancestor named Thomas Tingey, born 1750 in London, who migrated to America, and became a Commodore in the U.S. Navy. Would that ring any bells with you? He’s a “dead-end” who I have tried to find out more about. Thx

  24. Another Huguenot – almost certainly a very distant cousin – there aren’t many of us ……
    Until about 1910, the name was almost entirely confined within London, apart from people like your Thomas

  25. The Cycletrain! will be the human powered monorail transit of the future. But meanwhile, there’s a lot to be said for maintaining and improving existing transit, especially buses, although human powered buses are definitely doable.

  26. Utterly fascinating, as is the fact that there have been three USS Tingey ..
    For more information …
    Most Huguenots came to the area very close to The City, specifically “Spitalfields” where some of the 18thC houses have, fortunately been saved from the wrecker’s ball ( Nearly lost the lot in the 1960-75 period.)
    See also
    and my own public-record picture of the same place

  27. This wasn’t sarcasm or irony. It’s from experience. Both motorists who hit me (not at the same time) said “I never saw you,” even though I was riding legally, visibly, in daylight, and believed I had made eye contact with the sausage truck driver stopped at the cross intersection before he rolled through it into me. Now it appears AVs will be saying the same thing, perhaps in a sweeter tone. OK, that was sarcasm.

  28. Glad that you appreciate your cousins’ efforts over here in the Colonies! Thanks for the links – I will check them out and get back…

  29. I was thinking of the (apparently switched-off or downgraded) radar/lidar systems that Uber had deliberately disabled on the car ….

  30. for more info contact me on – remove all spaces from this & substitute as appropriate:
    f l e d e r m a u s [AT] d s l [DOT] p i p e x {DOT] c o m

  31. Uber had downgraded and/or deleted part of the safety software … oh dear, greedy murderous cheapskates, maybe?
