Car Companies Failed at Regulating Themselves. Why Would Autonomous Car Companies Be Different?

A video released by Tempe police from inside the Uber car shows the backup driver's attention diverted from the road before the collision that killed Elaine Herzberg.

How safe are the autonomous vehicles being tested on American streets? The killing of Elaine Herzberg by an Uber car in self-driving mode last month has focused attention on the startling absence of public oversight of this nascent technology.

One of the main selling points of autonomous vehicles is that they will be much safer than human drivers. But Uber AVs had only logged about 3 million miles before one of the company’s cars ran over and killed someone. For human drivers in the U.S., the rate is one fatality for every 86 million miles.

Other companies have better safety records than Uber, the New York Times reported last week. Waymo’s self-driving vehicles average about 5,600 miles between interventions by the human back-up driver, for instance, while Uber’s only average 13 miles. But all the companies competing to bring AVs to market are operating in a lax regulatory environment that lacks basic precautions. The companies have basically written their own rules.

Some reluctance to impose public oversight stems from the expectation that AVs need to learn in real-world conditions in order to become safely operable in those conditions. According to this logic, Elaine Herzberg’s life was sacrificed to save other lives in the future.

That’s a bad excuse for the companies to continue to avoid safety regulations, argues the Sightline Institute’s Daniel Malarkey. Expanding on a piece he wrote for the Seattle Times, Malarkey says a system in which AV companies make the rules is going to fail:

The auto industry has fought sensible safety standards for decades, opposing requirements for safety glass, seat belts, airbags, and catalytic converters. Volkswagen committed a massive global fraud by falsifying tests of the emissions from its diesel cars. The auto industry is at it again, lobbying Scott Pruitt at the EPA to roll back Obama-era clean car standards. We can’t trust the industry to self-regulate.

Especially since setting proper safety standards for autonomous vehicles is a hard problem to solve. For all the mayhem on our highways, human drivers now drive close to 100 million miles on average before a fatality occurs. The self-driving car industry has logged around 15 million miles of test rides on public roads and has now inflicted its first casualty while testing. Based on these numbers alone, there is a 16 percent chance that the entire self-driving fleet as tested is just as safe as human drivers and simply got unlucky with an early fatality. There’s a better than 4 out of 5 chance, if the technology’s capabilities are uniform across companies, that it is less safe than human drivers. Companies like Waymo have already argued that their technology works better than Uber’s but haven’t given us any way to prove it.
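The 16 percent figure in the excerpt can be reproduced with a simple Poisson model. The sketch below is illustrative only, assuming the article's numbers: roughly 15 million self-driving test miles and one human fatality per 86 million miles.

```python
import math

# Figures taken from the article (approximations, not official statistics):
TEST_MILES = 15e6               # total self-driving test miles logged
HUMAN_MILES_PER_FATALITY = 86e6 # ~1 fatality per 86 million human-driven miles

# If the self-driving fleet were exactly as safe as human drivers, the
# number of fatalities over 15 million miles would follow a Poisson
# distribution with this expected value:
expected_fatalities = TEST_MILES / HUMAN_MILES_PER_FATALITY  # ~0.174

# Probability of seeing at least one fatality anyway -- i.e., the chance
# that a fleet as safe as human drivers "simply got unlucky":
p_at_least_one = 1 - math.exp(-expected_fatalities)

print(f"P(>=1 fatality, if as safe as humans) = {p_at_least_one:.0%}")  # ~16%
```

Run with the article's inputs, the model gives about a 16 percent chance of an "unlucky" fatality, matching the excerpt; the complementary roughly 84 percent is the "4 out of 5 chance" that the fleet is less safe than human drivers.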

Each company developing self-driving cars would have to log billions of miles of training rides before they could prove statistically that they are substantially safer than humans. Analysts at Rand have shown that too strict a standard for proving safety performance could substantially delay deployment and cost more lives than a less strict standard that allowed the technology to deploy sooner.
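The scale of the proof problem can be sketched with the same Poisson model. The calculation below is a simplified illustration in the spirit of the RAND analysis the excerpt cites, assuming one human fatality per 86 million miles: it asks how many fatality-free miles a fleet must log before we can say, at 95 percent confidence, that its fatality rate is no worse than the human rate. Demonstrating that it is *substantially better*, while real-world fatalities accumulate, pushes the requirement far higher.

```python
import math

# Assumed baseline from the article: ~1 human fatality per 86 million miles.
HUMAN_RATE = 1 / 86e6  # fatalities per mile

def fatality_free_miles_needed(confidence: float = 0.95) -> float:
    """Miles a fleet must drive with ZERO fatalities before we can claim,
    at the given confidence, that its fatality rate is at or below the
    human rate. Solves exp(-HUMAN_RATE * miles) <= 1 - confidence."""
    return math.log(1 / (1 - confidence)) / HUMAN_RATE

miles = fatality_free_miles_needed()
print(f"~{miles/1e6:.0f} million fatality-free miles needed")  # ~258 million
```

Even this best case (zero fatalities, and proving only parity rather than superiority) requires hundreds of millions of test miles, an order of magnitude more than the industry had logged at the time of the article; proving a specific margin of improvement is what drives the requirement into the billions.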

The task of weighing potential benefits and costs over time in the face of uncertainty belongs with the public sector, not the companies racing to develop the technology. Despite the promise of breakthrough drugs, for example, we don’t allow pharmaceutical companies to use their own testing procedures and market a new product whenever they say it is ready. Instead, we require them to prove safety and efficacy to the Food and Drug Administration, through a multi-tiered approval process.

The federal regulatory process does pose a challenge for regulating AVs. At the typical pace of U.S. DOT’s rulemaking process, imposing new regulations would take years. We can’t wait that long. Self-driving car technology is developing rapidly, and the public deserves real assurances, as soon as possible, that it will be tested safely and responsibly.

  • thielges

    Expect AV companies to push back hard on any regulation that might reveal their proprietary engineering designs. While it is reasonable for a company that spends heavily on R&D to protect that investment, there is hopefully a middle ground that allows the right sorts of information to be shared without giving secrets away to competitors. For example, it should be a no-brainer that AV companies release the recorded raw sensor datastreams from collisions or close calls. That would allow competitors and third parties to analyze the real data and determine what improvements could have avoided the collision.

  • Larry Littlefield

    “Waymo’s self-driving vehicles average about 5,600 miles between interventions by the human back-up driver, for instance, while Uber’s only average 13 miles.”

    One thing that has absolutely failed is expecting human non-drivers to react when the system fails. That is simply too much to ask. It needs to be the other way around.

    And there is a big difference between deaths and injuries to vehicle occupants, who gain the benefits of automobility associated with the risk of crashes, and the deaths and injuries of cyclists and pedestrians, who are not compensated for that risk and who pose little or no risk to anyone else.

  • AMH

    Very well said.

  • Larry Littlefield

    Also, they should have started with trains. On the road, the number of possible variables is huge. Trains would be easier.

  • AMH

    Is there any industry that successfully regulates itself? Any case I can think of where we relied on self-regulation (mainly finance and tech, where regulation is slow to catch up to new problems) it was unsuccessful, leading to calls for better regulation after a crisis or scandal. Industry generally opposes regulation of itself (and favors regulation of others), but until the current administration there was at least some balancing of private interest with public interest.

  • To answer the question in the headline, there are actually a variety of reasons why this would be different. The companies will be liable for any accidents caused by their software — in most cases, due to their deep pockets and the high costs of high-tech litigation, much more liable than in the case of ordinary crashes. So if they lie about their safety and skimp on it, they are only lying to themselves, for they will pay. This is unlike the VW emissions scandal, where the company was lying to emissions regulators and the air we breathe paid the price.

    Unlike hardware problems (like ignition switches) which cost a fortune to fix in every car, software updates do not cost anything to distribute, though they cost money to create.

    So the key area where you would consider regulation would be areas where a company might be willing to take an unacceptable risk (to itself as well as the public) because it is foolhardy. Young companies will be foolhardy; mature companies are less likely to be.

    One correction learned from this accident is that penalties for wrongful death could be given suitable minimums, to avoid the situation where the person killed or injured is a homeless person or someone else the regular court system does not value as highly. (Here we’re correcting the courts more than the companies, but the result is to assure high penalties in all situations — though of course nobody would say “we can risk being less safe because if we hit a homeless woman, it does not cost as much,” so it’s not clear how much extra deterrent this would provide.)

    Certainly regulators are completely beyond their competency in working out more than the simplest safety rules for these vehicles. They readily admit this. If safety rules are to be written to govern the companies, there is no choice but to have them be written by the most sophisticated companies, who are inventing brand new safety procedures. This is how automotive regulation usually happens — a company invents seat belts or airbags or whatever, and decades later regulators force all the other companies to include them.

  • You might not call it “self” regulation but many industries regulate themselves under the fear of tort judgments and public backlash. In fact, this is the way that most industries are regulated. The auto accident is the most common major tort (by a huge margin) and the best understood. There is a $200B industry in the USA to deal with it.
