
Car Companies Failed at Regulating Themselves. Why Would Autonomous Car Companies Be Different?

A video released by Tempe police from inside the Uber car shows the backup driver’s attention diverted from the road before the collision that killed Elaine Herzberg.

How safe are the autonomous vehicles being tested on American streets? The killing of Elaine Herzberg by an Uber car in self-driving mode last month has focused attention on the startling absence of public oversight of this nascent technology.

One of the main selling points of autonomous vehicles is that they will be much safer than human drivers. But Uber AVs had only logged about 3 million miles before one of the company's cars ran over and killed someone. For human drivers in the U.S., the rate is one fatality for every 86 million miles.

Other companies have better safety records than Uber, the New York Times reported last week. Waymo's self-driving vehicles average about 5,600 miles between interventions by the human backup driver, for instance, while Uber's average only 13 miles. But all the companies competing to bring AVs to market are operating in a lax regulatory environment that lacks basic precautions. The companies have basically written their own rules.

Some reluctance to impose public oversight stems from the expectation that AVs need to learn in real-world conditions in order to become safely operable in those conditions. According to this logic, Elaine Herzberg's life was sacrificed to save other lives in the future.

That's a bad excuse for the companies to continue to avoid safety regulations, argues the Sightline Institute's Daniel Malarkey. Expanding on a piece he wrote for the Seattle Times, Malarkey says a system in which AV companies make the rules is going to fail:

The auto industry has fought sensible safety standards for decades, opposing requirements for safety glass, seat belts, airbags, and catalytic converters. Volkswagen committed a massive global fraud by falsifying tests of the emissions from its diesel cars. The auto industry is at it again, working Scott Pruitt at EPA to roll back Obama-era clean car standards. We can’t trust the industry to self-regulate.

Especially since setting proper safety standards for autonomous vehicles is a hard problem to solve. For all the mayhem on our highways, human drivers now drive close to 100 million miles on average before a fatality occurs. The self-driving car industry has logged around 15 million miles of test rides on public roads and has now inflicted its first casualty while testing. Based on these numbers alone, there is a 16 percent chance that the entire self-driving fleet as tested is just as safe as human drivers and simply got unlucky with an early fatality. There’s a better than 4 out of 5 chance, if the technology’s capabilities are uniform across companies, that it is less safe than human drivers. Companies like Waymo have already argued that their technology works better than Uber’s but haven’t given us any way to prove it.

Each company developing self-driving cars would have to log billions of miles of training rides before they could prove statistically that they are substantially safer than humans. Analysts at Rand have shown that too strict a standard for proving safety performance could substantially delay deployment and cost more lives than a less strict standard that allowed the technology to deploy sooner.

The task of weighing potential benefits and costs over time in the face of uncertainty belongs with the public sector, not the companies racing to develop the technology. Despite the promise of breakthrough drugs, for example, we don’t allow pharmaceutical companies to use their own testing procedures and market a new product whenever they say it is ready. Instead, we require them to prove safety and efficacy to the Food and Drug Administration, through a multi-tiered approval process.
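To see where Malarkey's 16 percent and "4 out of 5" figures come from, here is a minimal back-of-the-envelope sketch. It assumes fatalities follow a Poisson process with a constant per-mile rate and uses the one-fatality-per-86-million-miles figure cited earlier in this post (plugging in the quote's "close to 100 million miles" gives a slightly lower probability). Malarkey's exact inputs aren't spelled out, so treat this as illustrative, not a reconstruction of his math.

```python
import math

# Illustrative assumptions, not figures confirmed by Malarkey's piece:
# treat traffic fatalities as a Poisson process with a constant per-mile rate.
HUMAN_FATALITY_RATE = 1 / 86_000_000   # ~1 fatality per 86 million miles, cited above
AV_TEST_MILES = 15_000_000             # industry-wide AV test mileage cited in the quote

# Expected fatalities if the AV fleet were exactly as safe as human drivers
expected = HUMAN_FATALITY_RATE * AV_TEST_MILES

# Probability of at least one fatality somewhere in those test miles,
# even with no safety deficit relative to human drivers
p_at_least_one = 1 - math.exp(-expected)

print(f"expected fatalities at the human rate: {expected:.2f}")          # ~0.17
print(f"P(>=1 fatality | AVs as safe as humans): {p_at_least_one:.0%}")  # ~16%
```

In other words, a fleet exactly as safe as human drivers would still have produced an early fatality roughly one time in six, and the complementary chance that the fleet is in fact less safe is better than 4 in 5 under these simple assumptions. That is the quote's point: one crash in 15 million miles can neither prove nor disprove a safety advantage, which is why independent verification matters.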

The federal regulatory process does pose a challenge for regulating AVs. At the typical pace of U.S. DOT's rulemaking process, imposing new regulations would take years. We can't wait that long. Self-driving car technology is developing rapidly, and the public deserves real assurances, as soon as possible, that it will be tested safely and responsibly.
