
Study: AVs May Not Detect Darker-Skinned Pedestrians As Often As Lighter Ones

A pedestrian detection system senses three white-presenting pedestrians in its path. Source: Virginia Department of Transportation via Creative Commons.

Driverless cars are worse at detecting people with darker skin tones, meaning that autonomous vehicles might not solve the already disproportionate pedestrian death toll faced by Black communities, according to a new study.

The facial and body recognition technology built into many pedestrian detection systems does not recognize and react to darker-skinned people as consistently as it does lighter-skinned people, according to the study from Georgia Tech. The researchers studied several leading technologies and found that they were consistently between 4 and 10 percent less accurate when they encountered images of human figures with skin types four, five and six on the Fitzpatrick scale, a dermatological classification of human skin tones that is commonly used to differentiate skin colors in machine learning research.
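
The comparison behind those numbers is conceptually simple: run the same detector over labeled images and tally how often it finds the pedestrians in each skin-type group. Below is a minimal sketch of that kind of per-group evaluation, with hypothetical detections, labels and groupings standing in for the study's actual code and data.

```python
# Hypothetical sketch: compare pedestrian-detection recall across
# Fitzpatrick skin-type groups. The records below are made-up stand-ins,
# not the Georgia Tech study's data or models.

from collections import defaultdict

# Each record: (fitzpatrick_type, was_pedestrian_detected)
results = [
    (1, True), (2, True), (2, True), (3, False),
    (4, True), (5, False), (5, True), (6, False),
]

hits = defaultdict(int)
totals = defaultdict(int)
for skin_type, detected in results:
    group = "lighter (1-3)" if skin_type <= 3 else "darker (4-6)"
    totals[group] += 1
    hits[group] += detected  # True counts as 1, False as 0

for group in sorted(totals):
    recall = hits[group] / totals[group]
    print(f"{group}: detected {recall:.0%} of pedestrians")
```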

The Fitzpatrick scale

Many Black, Indigenous and other people of color fall at the higher end of the Fitzpatrick scale; nationally, people of color are significantly more likely to be killed in pedestrian crashes with motor vehicles.

Source: Smart Growth America

What may be more troubling, though, is why automated detection systems are so bad at detecting Black and brown skin. Researchers point to two primary reasons, and both have to do with the inequitable data sets that companies use to train computers that are supposed to be smarter than human drivers.

First, and most outrageously: the researchers found that, on average, the training data sets that many pedestrian detection systems rely on had roughly 3.5 times more examples of lower-Fitzpatrick-scored (read: typically white) pedestrians than higher-scored (Black and brown) pedestrians. Put another way: pedestrian detection systems are typically better at saving the lives of walkers who look like the walkers they've seen most often in their training data, and the computers are seeing far fewer images of dark-skinned people.
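
Measuring that kind of imbalance doesn't require anything fancy: count the labeled examples in each Fitzpatrick group and take the ratio. Here is a minimal sketch that assumes annotations carrying a Fitzpatrick label; the format and numbers are illustrative, not the actual benchmark the study audited.

```python
# Hypothetical sketch: measure skin-tone imbalance in a pedestrian training set.
# The annotation format and counts are assumptions for illustration only.

from collections import Counter

annotations = [
    {"id": 1, "fitzpatrick": 2}, {"id": 2, "fitzpatrick": 1},
    {"id": 3, "fitzpatrick": 5}, {"id": 4, "fitzpatrick": 3},
    {"id": 5, "fitzpatrick": 2}, {"id": 6, "fitzpatrick": 6},
    {"id": 7, "fitzpatrick": 3}, {"id": 8, "fitzpatrick": 1},
    {"id": 9, "fitzpatrick": 2},
]

# Bucket every labeled pedestrian into the lighter or darker group.
counts = Counter(
    "lighter (1-3)" if a["fitzpatrick"] <= 3 else "darker (4-6)"
    for a in annotations
)

ratio = counts["lighter (1-3)"] / counts["darker (4-6)"]
print(counts)
print(f"Lighter-skinned examples outnumber darker-skinned ones {ratio:.1f}x")
```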

Second, researchers found that many of the data sets themselves were fairly small, and as anyone who's taken a high school statistics class knows, samples that are too small lead to unreliable conclusions and, in this case, potentially dangerous decisions on the road. That's part of why people who build machine learning and artificial intelligence systems are actively working to expand the data available to train pedestrian detection systems; if you've ever "proved you're human" to your email server by picking out the crosswalk in a Captcha photograph, you probably added to an automated vehicle's training set without even knowing it. But even after years of effort, it still hasn't been enough.
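
The statistics point is easy to make concrete: the smaller the sample, the wider the uncertainty around any detection rate measured from it. A quick sketch using a standard 95 percent normal-approximation confidence interval, with illustrative numbers rather than the study's:

```python
# Hypothetical sketch: how sample size changes the uncertainty around a
# measured detection rate. 95% normal-approximation interval; the numbers
# are illustrative, not drawn from the study.

import math

def margin_of_error(rate: float, n: int, z: float = 1.96) -> float:
    """Margin of error for a proportion measured on n examples."""
    return z * math.sqrt(rate * (1 - rate) / n)

measured_rate = 0.90  # detector finds 90% of pedestrians in the sample
for n in (50, 500, 5000):
    moe = margin_of_error(measured_rate, n)
    print(f"n={n:>5}: {measured_rate:.0%} ± {moe:.1%}")
```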

Automated vehicles might be getting incrementally smarter every day, but as countless studies before this one have flagged, they're nowhere near smart enough to be on our roads yet — even though they are on our roads anyway, and have already killed pedestrians.

Moreover, we still haven't proven that pedestrian detection systems will ever be smart enough to prevent 100 percent of car crashes and save 100 percent of pedestrian lives. A recent study from the Insurance Institute for Highway Safety found that even if every single car on the road were made autonomous tomorrow, at least 66 percent of crashes would still happen. According to the study, that's in large part because AVs just aren't up to the complex task of piloting a dangerous machine through a dangerous built environment, and require all-too-dangerous human beings, complete with their implicit racial biases, to help them out.

“AVs need not only to obey traffic laws, but also to adapt to road conditions and implement driving strategies that account for uncertainty about what other road users will do, such as driving more slowly than a human driver would in areas with high pedestrian traffic or in low-visibility conditions,” the study said.

That paragraph should trouble anyone who wants drivers or "driverless" cars to stop killing Black people — and sadly, that time isn't coming anytime soon. In the short term, the best approach would be to dismantle the underlying racist causes of traffic violence that disproportionately affect communities of color — and stop hoping that miracle tech will make the problem go away.
