Here’s the thing about car crashes: They are, blessedly, pretty rare. In the US, nine people are injured in motor vehicle crashes for every 100 million miles traveled in cars, according to data from the National Highway Traffic Safety Administration.
Here’s the thing about computer-based models: They’re not great at predicting rare events. “Accidents are going to be rare anyway, and models tend to miss rare events because they just don’t occur frequently enough,” says Tristan Glatard, an associate professor of computer science at Concordia University, where he’s working with colleagues to build models that might predict car crashes before they happen. “It’s like finding a needle in a haystack.”
Some good things might happen if someone could find that needle, if they managed to transform streets and roads into streams of data, and predict what might happen there. Emergency responders might arrive at crashes a bit faster. Government officials might spot a problematic road—and fix it.
OK, it’s not quite predicting the future. But it’s getting eerily close. So even though it’s hard, and often expensive, and always complicated, cities, researchers, and the federal Department of Transportation are working to do just that.
In May, a team of medical researchers with UCLA and the University of California, Irvine published a paper in the journal JAMA Surgery suggesting that places in California might be able to use data from the crowdsourced traffic app Waze to cut emergency response times. (Waze has a four-year-old program that gives cities traffic data in exchange for real-time information about problems its users might want to avoid, like sudden road closures.) By comparing the data from the Google-owned service with crash data from the California Highway Patrol, the researchers concluded that Waze users notify the app of crashes an average of 2 minutes, 41 seconds before anyone alerts law enforcement.
That almost three minutes of lead time might not always be the difference between life and death, says Sean Young, a professor of medicine at UCLA and UCI who serves as the executive director of the University of California Institute for Prediction Technology. But “if these methods can cut the response time down by between 20 to 60 percent, then it’s going to have a positive clinical impact,” he says. “It’s generally agreed upon that the faster you get into the emergency room, the better the clinical outcomes will be.”
Last year, the Transportation Department’s Volpe Center wrapped up its own analysis of six months of Waze and accident report data from Maryland, and found something similar: Its researchers could build a computer model from the crowdsourced info that closely followed the crashes reported to the police. In fact, the crowdsourced data had some advantages over the official crash tallies, because it caught crashes that weren’t major enough to be reported, but were major enough to cause serious traffic slowdowns. The government researchers wrote that the model could “offer an early indicator of crash risk,” identifying where crashes might happen before they do.
Now, the DOT is funding additional research, this time with cities that might actually use the data. In Tennessee, government researchers are working with the Highway Patrol to incorporate Waze data into the state’s crash-prediction model, with the hope of making it accurate down to an hour inside a one-square-mile grid, instead of the current four hours within a 42-square-mile grid. In Bellevue, Washington, the DOT has helped to build an interactive dashboard that officials can use to identify crash patterns and risks. If a bunch of crashes are happening in the same section of roadway, “then the heatmap starts glowing,” says Franz Loewenherz, a Bellevue transportation planner. The city might then start collecting data from local traffic cameras to look for causes.
Bellevue is a nice test case for this kind of data experiment because it’s already very good at collecting and coordinating data from police crash reports and 911 calls to tweak its transportation. (Many places struggle to even put their police crash reports in forms that are useful to road planners, so that planners might spot persistent crash patterns.)
The DOT can use Bellevue to test how close the crowdsourced traffic data is to what’s actually happening on the ground.
But it will take a lot of work before these sorts of traffic data experiments go mainstream—in part because few places are like Bellevue. “You have to have a lot of data, and diverse types of data, and then be able to analyze it for it to be actionable instead of just piling up,” says Christopher Cherry, an engineering professor with the University of Tennessee who recently completed a study of how traffic data could be used to improve road safety. The traffic data itself is useful, sure. But to predict the risk of crashes, and to prevent them, you should also probably have a sense of where crashes are happening, what the roads in question look like, and how those roads perform under different weather conditions. And then you have to link all those datasets up and help them “talk” to each other—no small feat.
Back at UCLA and UCI, researchers are trying to figure out how to massage the Waze traffic data to make it more accurate. There’s a good reason that Google traffic data can’t be subbed in for 911 calls, says Young, the researcher: There are still plenty of “false positives,” when traffic data identifies a crash that isn’t there, or isn’t serious enough to warrant medical attention. “If you use Waze data as the gold standard and any time a Waze user reports a car crash, you alert police departments, then you’re diverting them from all kinds of other resources needed for crime, for public health, and safety,” he says.
Glatard and his team at Concordia, in Montreal, recently released a paper suggesting they could combine three datasets—on the city’s road networks, on its crashes, and on its weather—to predict where crashes might happen with 85 percent accuracy. But about one out of every eight crashes it predicts never ends up happening. Eventually, he’d like to see city authorities use this kind of info to route drivers around streets that get especially dangerous when it snows. But first, he wants to train the model on more data—datasets on Montreal traffic, and Montreal public transportation, and the way Montreal drivers drive. “Models work as long as we have good data sources, and a lot of them,” he says. So before anyone can see crashes before they happen, Minority Report-style, they have to get collecting.