
Car Companies Failed at Regulating Themselves. Why Would Autonomous Car Companies Be Different?

A video released by Tempe police from inside the Uber car shows the backup driver’s attention diverted from the road before the collision that killed Elaine Herzberg.

How safe are the autonomous vehicles being tested on American streets? The killing of Elaine Herzberg by an Uber car in self-driving mode last month has focused attention on the startling absence of public oversight of this nascent technology.

One of the main selling points of autonomous vehicles is that they will be much safer than human drivers. But Uber AVs had only logged about 3 million miles before one of the company's cars ran over and killed someone. For human drivers in the U.S., the rate is one fatality for every 86 million miles.

Other companies have better safety records than Uber, the New York Times reported last week. Waymo's self-driving vehicles, for instance, average about 5,600 miles between interventions by the human backup driver, while Uber's average only 13. But all the companies competing to bring AVs to market are operating in a lax regulatory environment that lacks basic precautions. The companies have essentially written their own rules.

Some reluctance to impose public oversight stems from the expectation that AVs need to learn in real-world conditions in order to become safely operable in those conditions. According to this logic, Elaine Herzberg's life was sacrificed to save other lives in the future.

That's a bad excuse for the companies to continue to avoid safety regulations, argues the Sightline Institute's Daniel Malarkey. Expanding on a piece he wrote for the Seattle Times, Malarkey says a system in which AV companies make the rules is going to fail:

The auto industry has fought sensible safety standards for decades, opposing requirements for safety glass, seat belts, airbags, and catalytic converters. Volkswagen committed a massive global fraud by falsifying tests of the emissions from its diesel cars. The auto industry is at it again, working with Scott Pruitt at EPA to roll back Obama-era clean car standards. We can’t trust the industry to self-regulate.

Especially since setting proper safety standards for autonomous vehicles is a hard problem to solve. For all the mayhem on our highways, human drivers now drive close to 100 million miles on average before a fatality occurs. The self-driving car industry has logged around 15 million miles of test rides on public roads and has now inflicted its first casualty while testing. Based on these numbers alone, there is a 16 percent chance that the entire self-driving fleet as tested is just as safe as human drivers and simply got unlucky with an early fatality. There’s a better than 4 out of 5 chance, if the technology’s capabilities are uniform across companies, that it is less safe than human drivers. Companies like Waymo have already argued that their technology works better than Uber’s but haven’t given us any way to prove it.

Each company developing self-driving cars would have to log billions of miles of training rides before they could prove statistically that they are substantially safer than humans. Analysts at Rand have shown that too strict a standard for proving safety performance could substantially delay deployment and cost more lives than a less strict standard that allowed the technology to deploy sooner.

The task of weighing potential benefits and costs over time in the face of uncertainty belongs with the public sector, not the companies racing to develop the technology. Despite the promise of breakthrough drugs, for example, we don’t allow pharmaceutical companies to use their own testing procedures and market a new product whenever they say it is ready. Instead, we require them to prove safety and efficacy to the Food and Drug Administration, through a multi-tiered approval process.
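For readers who want to see where Malarkey's 16 percent figure comes from, here is a minimal back-of-the-envelope sketch in Python. It assumes fatalities follow a Poisson process at a constant rate per mile and uses only the numbers already cited in this piece; the calculation is our illustration, not Malarkey's own model.

```python
# Back-of-the-envelope check of the "16 percent" figure in the quote above.
# Assumes fatalities are a Poisson process with a constant rate per mile;
# the mileage and rates are the ones cited in this article, not new data.
import math

AV_TEST_MILES = 15e6  # self-driving test miles logged industry-wide, per Malarkey

def p_at_least_one_fatality(test_miles: float, miles_per_fatality: float) -> float:
    """Probability of one or more fatalities in `test_miles`
    if the true rate is one fatality per `miles_per_fatality` miles."""
    expected = test_miles / miles_per_fatality  # Poisson mean
    return 1 - math.exp(-expected)              # P(N >= 1) = 1 - P(N = 0)

# Human-driver benchmark of ~1 fatality per 86 million miles (cited earlier)
print(p_at_least_one_fatality(AV_TEST_MILES, 86e6))   # ~0.16
# Malarkey's rounder "close to 100 million miles" benchmark
print(p_at_least_one_fatality(AV_TEST_MILES, 100e6))  # ~0.14
```

In other words, if the test fleet were exactly as safe as human drivers, there was still roughly a one-in-six chance of seeing a fatality in its first 15 million miles; the complement of that probability appears to be where the "better than 4 out of 5" figure comes from.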
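The "billions of miles" point can be sketched the same way. The snippet below uses the statistical rule of three — after N fatality-free miles, the 95 percent upper confidence bound on the fatality rate is roughly 3/N — as a simplified stand-in for the RAND analysis Malarkey cites, not a reproduction of it.

```python
# Rough illustration of why proving "substantially safer" takes so many miles.
# Rule of three: after N fatality-free miles, the ~95% upper confidence bound
# on the fatality rate is about 3 / N. A simplified stand-in for RAND's model.

HUMAN_MILES_PER_FATALITY = 86e6

def miles_to_demonstrate(target_miles_per_fatality: float) -> float:
    """Fatality-free miles needed to bound the rate below the target
    at ~95% confidence, under the rule of three."""
    return 3 * target_miles_per_fatality

# Just to show the fleet is no worse than human drivers:
print(f"{miles_to_demonstrate(HUMAN_MILES_PER_FATALITY):.2e}")       # ~2.6e8 miles
# To show it is, say, ten times safer than human drivers:
print(f"{miles_to_demonstrate(10 * HUMAN_MILES_PER_FATALITY):.2e}")  # ~2.6e9 miles
```

By this rough measure, a fleet needs hundreds of millions of fatality-free miles just to show it matches human drivers, and billions to demonstrate a large improvement — and any fatality along the way pushes the requirement higher.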

The federal regulatory process does pose a challenge for AV oversight. At U.S. DOT's typical rulemaking pace, imposing new rules would take years, and we can't wait that long. Self-driving car technology is developing rapidly, and the public deserves real assurances, as soon as possible, that it will be tested safely and responsibly.
