Why U.S. Car Crash Reporting Is Broken
Almost a third of car crashes involving a vulnerable road user go unreported in the nation’s capital, skewing District crash totals — but underreporting is far from the only thing wrong with reporting standards across the U.S., advocates say.
Last week, Citylab broke the news of an explosive Washington, D.C., study showing that police had failed to record as much as 30 percent of 911 calls about drivers striking pedestrians, bicyclists, and other road users, with crashes going unreported most often in Black and brown neighborhoods.
The data gap might be explained, in part, by the fact that police are generally required to record a crash only if someone required medical transport for a serious injury, or if a motorist is likely to make an insurance claim. Bicyclists and users of other low-carbon vehicles typically don’t have vehicle insurance, but crashes involving them are still critical data points that city leaders rely on to shape their Vision Zero plans.
Add in the distrust that BIPOC communities often feel toward police, and D.C. advocates say that as much as 40 percent of crashes in mostly Black areas may go uncounted — so District leaders may not even know where the city’s most dangerous corridors are, much less have the specific data they need to fix them.
But the underreporting of non-fatal crashes isn’t the only reason why U.S. communities don’t have a full picture of our national traffic-violence epidemic. Here are three fundamental problems with local and national crash-reporting standards — and how to fix them.
1. There is no mandatory federal crash-reporting standard
The national Fatality Analysis Reporting System contains a universe of useful data about the 6.5 million car crashes that happen on U.S. roads in an average year. But that data is all built on crash reports from tens of thousands of police departments — and each of those departments gets to decide which details about those crashes they think should count.
Not surprisingly, a lot of the critical information falls through the cracks. A 2016 report from the National Safety Council found that crash reports in 26 states lacked a field for officers to record whether drivers were texting, and that no state crash reports included a field to record whether drivers were fatigued, despite the fact that the National Highway Traffic Safety Administration had conducted major public awareness campaigns on both issues.
Some states have updated their reporting standards recently — but not all cities and municipalities in those states have followed suit. In 2019, for instance, 44 percent of officers simply didn’t report whether a driver was distracted in the event of a crash, nor did they note whether drivers were distracted by common behaviors, such as texting — likely because local forms don’t require them to do so.
2. The standard that NHTSA recommends is deeply flawed
States aren’t totally flying blind on crash reporting, but federal guidelines are voluntary — and they’re not especially forward-thinking.
Since its initial publication in 1998, the little-known Model Minimum Uniform Crash Criteria has advised states on the collection of hundreds of crash data points, covering roadway features, vehicles, drivers, and other road users. But some of the most widely recognized causes of our traffic-violence epidemic still aren’t included in the 236-page tome.
For instance, the MMUCC doesn’t ask officers to note how far a walker who’s struck by a driver might be from the nearest unobstructed crosswalk; it simply advises them to note whether the walker was in a crosswalk or not, even if the closest one was miles away. Vehicle height and weight aren’t requested, either, despite the fact that the growing bulk of SUVs and pick-ups is well known to be accelerating U.S. walking fatalities.
By comparison, the details that officers are advised to collect about vulnerable road users are weirdly granular. There’s even a separate form to detail whether pedestrians and cyclists were wearing safety equipment, with separate codes for “reflectors,” “reflective clothing,” and “lighting,” such as flashlights.
There are some efforts to paint a fuller picture of the factors commonly involved in car crashes — but they’re not based on police reports alone. A recent study conducted by advocates in Portland, Ore., for instance, compiled law-enforcement data alongside hospital records, media reports, roadway measurements, and even manual inspection of adjacent land uses on Google Maps. Similar efforts are underway in San Francisco.
Some advocates say the MMUCC should include many of those factors in its next edition (2022) — and that the burden of collecting the data should be shared among departments of transportation, health, and other city offices, rather than left to the cops.
3. A lot is left up to officer discretion
Of course, even the perfect crash report might not give us the data we need to really understand our traffic violence crisis — at least if the officials filling it out rely on opinion rather than fact.
Law enforcement officers are often given wide latitude to guess at whether behaviors like speeding, distraction, and erratic driving contributed to crashes, even when harder data about what actually happened is readily available. Last year, reps for the U.S. Department of Transportation even acknowledged that as many as two-thirds of the “drunk walking” deaths the agency mentions in its safety campaigns were not verified by blood tests of the dead pedestrians, but instead based on “imputed blood-alcohol content” or an officer’s professional opinion of a walker’s sobriety — opinions presumably based on the testimony of the driver and other witnesses, because, of course, the walker died.
Wow. I’d hope that an agency dealing with massive feats of engineering would have a better grasp of basic physics than this.
Drinking and walking doesn’t kill people. Drinking and driving absolutely does. Physics can help you understand why.
— no justice no streets (@happifydesign) November 18, 2020
Simply put, officers also don’t always have the tools or the bandwidth to make accurate crash reports — which is why some advocates want to give reporting responsibilities to agencies with better resources and specialized training. But until that happens, police may keep leaning on the same, simplistic explanations that perpetuate the corrosive myth that individual road-user error accounts for almost all crashes, rather than collecting data that could identify more complex, systemic factors.
“At least anecdotally, a lot of officers will default to explanations [for crashes] that frankly, often eyewitnesses don’t even agree with,” said Rohit T. Aggarwala, senior fellow at Cornell Tech and author of an op-ed encouraging Secretary Buttigieg to reform FARS. “Excessive speed seems to be the default for many things; driver inattention is another one you hear a lot, too.”
But Aggarwala hopes that, with the right reforms, much of the subjectivity can be stripped out of crash reporting — especially with former management consultant Pete Buttigieg at the helm of the DOT.
“Guys who work at places like McKinsey are usually pretty obsessed with data,” said Aggarwala. “To bring a former consultant’s kind of perspective to this could be great to help identify where there are gaps that need to be filled…It would definitely be great to see the data cleaned up, and more importantly, to see the data matter.”