How Automakers Can Stop Humans from Over-Relying on Automated Safety Tech

Automakers aren't doing enough to ensure that drivers are ready to take over if their vehicles' self-driving features make mistakes, an insurance industry group argues, reigniting a debate about who should be held accountable when the drivers of partially automated cars kill people on U.S. roads.

Last week, the Insurance Institute for Highway Safety announced the criteria for a new set of rankings that will evaluate how well automakers are combating "automation complacency" among their drivers, as well as the intentional misuse of advanced driver assistance systems that are becoming increasingly common on new cars.

First used to describe the phenomenon of pilots mentally checking out in the cockpit when their planes are in "autopilot" mode, "automation complacency" has since been flagged by watchdogs like the National Transportation Safety Board as a major contributing factor in several crashes involving partially automated vehicles — and for the foreseeable future, every AV on the road will be only partially automated.

In 2020, the Board advised federal regulators to develop "performance standards for driver monitoring systems that will minimize driver disengagement, prevent automation complacency, and account for foreseeable misuse of the automation." Experts point out that many of those systems could be implemented with inexpensive software updates, such as programming advanced features not to activate if a driver's seatbelt isn't buckled, or by installing simple in-cabin cameras to monitor motorists' behavior.
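
As a rough illustration of how simple that kind of safeguard can be, here is a hypothetical sketch in Python of engagement-gating logic: the assistance feature declines to activate unless basic driver-readiness checks pass. Every name and threshold below (CabinState, may_engage_assist, the two-second eyes-off-road limit) is invented for this example; it is not drawn from any automaker's actual software.

```python
# Hypothetical sketch of the low-cost safeguards described above: partial
# automation only engages when basic driver-readiness checks pass.
# All names and thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class CabinState:
    seatbelt_buckled: bool         # from the existing seatbelt sensor
    hands_on_wheel: bool           # from steering-wheel torque/touch sensing
    eyes_off_road_seconds: float   # from a simple in-cabin driver camera


def may_engage_assist(state: CabinState, max_eyes_off: float = 2.0) -> bool:
    """Return True only if the driver appears ready to supervise the system."""
    if not state.seatbelt_buckled:
        return False               # feature won't activate with the belt unbuckled
    if not state.hands_on_wheel:
        return False               # hands must be on (or hovering near) the wheel
    if state.eyes_off_road_seconds > max_eyes_off:
        return False               # camera says the driver has looked away too long
    return True


# Example: an unbuckled driver is refused; an attentive, buckled driver is not.
print(may_engage_assist(CabinState(False, True, 0.0)))   # False
print(may_engage_assist(CabinState(True, True, 0.5)))    # True
```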

But the National Highway Traffic Safety Administration has yet to mandate either.

"There's a regulatory void," said David Harkey, president of the Insurance Institute, explaining his group's new rankings. "Automakers need to ensure that this tech is implemented so it does not lead to either intentional or unintentional misuse on the part of the driver. And consumers need to have a better understanding that these remain driver assistance systems, and not driver replacement systems. No one should purchase a vehicle thinking it’s capable of more than it is, and then learn later, sometimes through tragic consequences, that they were wrong."

In the absence of strong federal safety standards that would require companies to put better safeguards on advanced driver assistance systems, some automakers have sold customers their own narratives about what partial automation is really capable of.

In 2020, Tesla began selling an untested "Full Self Driving" mode that was not, as the name suggested, actually self-driving, paired with an owner's manual whose fine print warned customers not to trust the tech too much; it's still on the roads today. GMC, meanwhile, is currently marketing a "Super Cruise" feature with commercials that encourage drivers to operate their cars "hands free," even as a microscopic disclaimer urges those drivers to stay attentive.

Some pundits have partly blamed that dangerously optimistic advertising for encouraging bad driving behaviors that lead to real-world crashes — and the criminal charges that sometimes follow.

Just before the Institute announced its new rankings criteria, a California driver was charged with vehicular manslaughter after he failed to take control when his Tesla, which was using the company's other misleadingly named driving mode known as "Autopilot," ran a red light at high speed and killed two occupants of a Honda Civic.

The driver may be the first in U.S. history to be charged with a felony for failing to take control of his car when the popular advanced driver assistance system failed. (A test driver for Uber was charged with negligent homicide following a similar 2018 collision in Arizona, but many of the safety features on her car weren't yet being sold to consumers; Tesla, by contrast, has sold Autopilot and the beta version of Full Self Driving software to tens of thousands of untrained motorists).

The Insurance Institute hopes its rankings will help encourage customers to look beyond the automaker hype, if only for their own legal protection — and someday, inspire NHTSA to enforce stronger requirements on semi-automated vehicles overall. To earn a top ranking from the organization, an advanced driver assistance-equipped car will "need to ensure that the driver’s eyes are directed at the road and their hands are either on the wheel or ready to grab it at all times," and issue "escalating alerts and appropriate emergency procedures when the driver does not meet those conditions," including automatically bringing the car to a safe stop.
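
To show what "escalating alerts" ending in an automatic safe stop might look like in logic form, here is a hypothetical Python sketch. The stages, timings, and names are invented for illustration; they are not the IIHS criteria themselves or any shipping system's behavior.

```python
# Hypothetical escalation ladder for an inattentive driver, loosely modeled on
# the safeguards described above: warnings intensify until the car slows to a
# safe stop and locks out the feature. Timings and stages are invented.

def escalation_step(seconds_inattentive: float) -> str:
    """Map how long the driver has been inattentive to an escalating response."""
    if seconds_inattentive < 3:
        return "no action"                   # driver judged attentive enough
    if seconds_inattentive < 6:
        return "visual warning"              # dashboard prompt to watch the road
    if seconds_inattentive < 10:
        return "audible and haptic warning"  # chimes plus seat or wheel vibration
    if seconds_inattentive < 15:
        return "slow down, hazards on"       # begin bringing the car to a safe stop
    return "stopped, automation locked out"  # driver must re-engage the feature manually


for seconds in (1, 4, 8, 12, 20):
    print(seconds, "->", escalation_step(seconds))
```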

Those alerts, of course, would help any driver behave more safely behind the wheel. Even drivers of fully manual cars are prone to "complacency" and distraction, if only because the tacit design cues of the autocentric roads they drive on often signal that it's okay to tune out.

But experts argue those safety features become even more critical when it comes to semi-automated cars, which have been linked to particularly egregious behavior behind the wheel, like sleeping, gaming, or even having sex instead of watching the road.

"We’re helping consumers understand what this tech can and cannot do," said Harkey. "Automakers are advertising these technology in a way that makes it seems like it can do more than it actually can."

Source: IIHS

At least so far, no automaker can accurately say that its advanced driver assistance systems are foolproof — which is why it's so critical that every automaker take steps to make sure that drivers don't misuse them. (Disturbingly, the Insurance Institute notes that "while most partial automation systems have some safeguards in place to help ensure drivers are focused and ready, none of them [currently] meets all the pending IIHS criteria.")

Doing that, though, would require the seemingly paradoxical acknowledgment that partially automated cars might require even more safety backstops than cars designed to be driven by human beings alone — at least as long as the public agrees that the point of advanced driver assistance systems is to protect all road users, rather than to make it easier for drivers to take a nap.

"At some point, we have to ask ourselves: are these, indeed, driver safety systems?" said Harkey. "Or are they driver convenience systems?"
