Study: Most Drivers Can’t Differentiate AVs and Driver Assistance Tech
The majority of vehicle owners think that installing advanced driver assistance technology makes their car a “fully automated self-driving vehicle,” a new study shows, and experts worry that such overconfidence could cost lives.
In a new survey, researchers at MIT and the market research firm J.D. Power gave 4,000 owners of both standard and semi-autonomous cars a multiple-choice quiz to see how well they understood what an AV actually is, and how much attention the partially automated vehicles currently available to consumers actually require of their drivers. No company currently offers a level four or level five AV, which the Society of Automotive Engineers defines, respectively, as a vehicle that can safely perform “all driving tasks in all situations” without human intervention, or one that will automatically sense conditions in which it cannot operate safely without a driver’s help and shut itself down.
A stunning 55 percent of the respondents, though, selected neither of those options when presented with a list of seven possible definitions of the term “self-driving.” Instead, they selected descriptions that are more accurately applied to the advanced driver assistance systems currently available on many vehicles, like “the system can perform most of the driving tasks in some situations, but might ask the driver to take control at some points” such as when inclement weather confuses a car’s cameras, or “the system can help with speed control and steering at the same time, and the driver would not need to keep their hands on the wheel but would need to remain attentive to the forward roadway.”
Experts worry that the finding could indicate that drivers think advanced driver assistance systems are just as good at preventing crashes as hypothetical autonomous driving technology that has yet to be invented, a dangerous assumption that has already been shown to lead to distracted behavior behind the wheel.
“Organizations working as technology pioneers have the responsibility to create realistic and accurate consumer expectations for what their products can and cannot do,” said Bryan Reimer, Ph.D., a research scientist at MIT and a co-author of the study. “Consumer overconfidence and lack of knowledge to date can lead to risk taking that will cause the AV industry to hit a lot of potholes.”
Even worse, the researchers found that the phenomenon of “automation complacency” doesn’t go away when drivers learn more about the true capabilities of advanced driver assistance-equipped cars, including when they actually buy and start driving them — and in some cases, automakers themselves may be actively adding to the confusion.
A respectable 37 percent of surveyed drivers accurately understood what an autonomous car can and can’t do, but the researchers found that the ones who reported that they knew “a great deal about AVs” were actually less likely to ace the test — and only 32 percent of current Tesla owners picked either of the right answers.
Elon Musk’s company famously offers both an “autopilot” function and a “full self-driving mode” on its cars, neither of which actually makes the vehicles autonomous. That misleading branding has made the automaker the target of lawsuits from its customers and from the bereaved families of Tesla drivers who died because they trusted the company’s tech too much, and has prompted an investigation by the National Transportation Safety Board.
“Where I get concerned is the language that’s used to describe the capabilities of the vehicle,” NTSB Chairwoman Jennifer Homendy recently told the New York Times. “It can be very dangerous.”
But the new report is short on solutions for addressing all this dangerous consumer confusion about automation tech, not to mention for implementing systemic safeguards against automation complacency, from which even the best-informed drivers aren’t totally immune.
Hearteningly, a substantial majority of survey respondents indicated they were willing to attend education classes about the proper operation of cars at all levels of automation; 53 percent were even willing to take a driver’s ed course on the topic, something that about one-third of states don’t require drivers to do before they get a license to operate a non-autonomous car. But the researchers didn’t provide any data on driver support for strategies that aren’t rooted in education, like installing in-cabin monitoring systems that warn distracted drivers to keep their eyes on the road, hands on the wheel, and butt in the driver’s seat, or enforcement measures that penalize people who are caught over-relying on ADAS and endangering themselves and other road users. Needless to say, penalizing automakers like Tesla, which recently had the audacity to install dashboard gaming systems that drivers can use while their non-autonomous cars are in motion, didn’t merit a mention, either.
With true autonomous vehicles still a long way off, some advocates think it’s past time for regulators to take aggressive action to prevent distraction among drivers, whether the cars they pilot come equipped with advanced tech or not. And forcing the gizmo-obsessed Musks of the world to be honest about their products’ limited safety capabilities certainly couldn’t hurt, either.
If you still think Elon Musk has a constructive approach to road safety, check out @nealboudette's excellent piece in the NY Times today.
I don't see how anyone can defend this.
— David Zipper (@DavidZipper) December 6, 2021