Self-driving cars

Car Companies Failed at Regulating Themselves. Why Would Autonomous Car Companies Be Different?

12:06 PM EDT on April 2, 2018

A video released by Tempe police from inside the Uber car shows the backup driver’s attention diverted from the road before the collision that killed Elaine Herzberg.

How safe are the autonomous vehicles being tested on American streets? The killing of Elaine Herzberg by an Uber car in self-driving mode last month has focused attention on the startling absence of public oversight of this nascent technology.

One of the main selling points of autonomous vehicles is that they will be much safer than human drivers. But Uber AVs had only logged about 3 million miles before one of the company's cars ran over and killed someone. For human drivers in the U.S., the rate is one fatality for every 86 million miles.

Other companies have better safety records than Uber, the New York Times reported last week. Waymo's self-driving vehicles average about 5,600 miles between interventions by the human backup driver, for instance, while Uber's average only 13 miles. But all the companies competing to bring AVs to market are operating in a lax regulatory environment that lacks basic precautions. The companies have basically written their own rules.

Some reluctance to impose public oversight stems from the expectation that AVs need to learn in real-world conditions in order to become safely operable in those conditions. According to this logic, Elaine Herzberg's life was sacrificed to save other lives in the future.

That's a bad excuse for the companies to continue to avoid safety regulations, argues the Sightline Institute's Daniel Malarkey. Expanding on a piece he wrote for the Seattle Times, Malarkey says a system in which AV companies make the rules is going to fail:

The auto industry has fought sensible safety standards for decades, opposing requirements for safety glass, seat belts, airbags, and catalytic converters. Volkswagen committed a massive global fraud by falsifying tests of the emissions from its diesel cars. The auto industry is at it again, working Scott Pruitt at EPA to roll back Obama-era clean car standards. We can’t trust the industry to self-regulate.

Especially since setting proper safety standards for autonomous vehicles is a hard problem to solve. For all the mayhem on our highways, human drivers now drive close to 100 million miles on average before a fatality occurs. The self-driving car industry has logged around 15 million miles of test rides on public roads and has now inflicted its first casualty while testing. Based on these numbers alone, there is a 16 percent chance that the entire self-driving fleet as tested is just as safe as human drivers and simply got unlucky with an early fatality. There’s a better than 4 out of 5 chance, if the technology’s capabilities are uniform across companies, that it is less safe than human drivers. Companies like Waymo have already argued that their technology works better than Uber’s but haven’t given us any way to prove it.

Each company developing self-driving cars would have to log billions of miles of training rides before they could prove statistically that they are substantially safer than humans. Analysts at Rand have shown that too strict a standard for proving safety performance could substantially delay deployment and cost more lives than a less strict standard that allowed the technology to deploy sooner.

The task of weighing potential benefits and costs over time in the face of uncertainty belongs with the public sector, not the companies racing to develop the technology. Despite the promise of breakthrough drugs, for example, we don’t allow pharmaceutical companies to use their own testing procedures and market a new product whenever they say it is ready. Instead, we require them to prove safety and efficacy to the Food and Drug Administration, through a multi-tiered approval process.
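For readers who want to see where those numbers come from, here is a minimal back-of-the-envelope sketch in Python. It is our own illustration, not Sightline's or RAND's actual analysis: it assumes fatalities arrive as a Poisson process at the human rate of roughly one per 86 million miles, asks how likely at least one death would be in about 15 million AV test miles, and then uses the statistical "rule of three" to estimate how many fatality-free miles a fleet would need before it could claim, with roughly 95 percent confidence, that it at least matches human drivers.

```python
import math

# Illustrative assumptions, taken from the figures cited in this article
# (not from Sightline's or RAND's own analysis):
HUMAN_MILES_PER_FATALITY = 86e6   # ~1 U.S. traffic death per 86 million miles driven
AV_TEST_MILES = 15e6              # ~miles of public-road testing logged by the AV industry

# If the AV fleet were exactly as safe as human drivers, the expected number
# of fatalities over its test miles would be:
expected_fatalities = AV_TEST_MILES / HUMAN_MILES_PER_FATALITY  # ~0.17

# Treating fatalities as a Poisson process, the chance that at least one death
# occurs purely by bad luck is:
p_at_least_one = 1 - math.exp(-expected_fatalities)
print(f"P(>=1 fatality if AVs merely match human safety): {p_at_least_one:.0%}")  # ~16%

# Flip the question: how many fatality-free miles would a fleet need before it
# could claim, with ~95% confidence, that its rate is no worse than humans'?
# The "rule of three" (zero events in n trials -> 95% upper bound of 3/n) gives:
miles_needed = 3 * HUMAN_MILES_PER_FATALITY
print(f"Fatality-free miles needed for that claim: ~{miles_needed/1e6:.0f} million")  # ~258 million
```

And that bound only covers being no worse than human drivers. Proving a fleet is substantially safer means detecting a difference between two very small rates, which is what pushes the requirement into the billions of miles Malarkey and the RAND analysts describe.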

Federal rulemaking does pose a challenge for regulating AVs: at U.S. DOT's typical pace, new rules would take years to finalize. We can't wait that long. Self-driving car technology is developing rapidly, and the public deserves real assurances, as soon as possible, that it will be tested safely and responsibly.
