It’s not uncommon, here on the internet, to see “rankings” of the most dangerous cities in America. It’s also not uncommon, in American media and politics more broadly, to see labels and references that parallel these rankings, just spelled out more tactfully than the *drumroll*…and the number one most dangerous city in America is!…manner the internet likes to adopt.
This desire to compare crime rates in American cities is understandable—treating cities as reference points against one another can go a long way towards understanding how good or bad a particular situation is. We want violent crime rates to decrease. Data is a tool that can help with that.
The problem is this, though: Often, the data shared is misleading.
A quick Google search for violent crime rates in American cities turns up an FBI dataset from 2017 with enormous caveats, a “self-researched” set of rankings with no data citations, and a clickbait slideshow based on the 2017 dataset.
The FBI data isn’t just out of date: under this reporting system, cities aren’t required to report anything but murders, and reporting in the other categories isn’t consistent across cities, or even across time. The inconsistency is starkest in the category of rape, where rapid cultural changes are helping us better understand the severity of the problem, but also making the data difficult to compare between years.
A “self-researched” set of rankings may be done well. It also might be entirely made up, or built on decisions you would not make if you were “self-researching” the matter yourself.
Clickbait is…clickbait, and the complaints about the 2017 dataset apply.
Even a perfect dataset, reported by city, would have its flaws. Beyond the city-by-city discrepancies in reporting, there’s the significant difference in what’s included within city boundaries: older cities like Philadelphia and Boston are compact, geographically distinct from their suburbs; newer cities like Houston and Phoenix sprawl, pulling suburban areas inside their city limits. Comparing cities’ respective crime rates is difficult because comparing cities is difficult. And that’s before accounting for intra-city differences in crime rates, like those seen between boroughs in New York and neighborhoods in Chicago.
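To see how much boundary-drawing alone can matter, here’s a toy calculation in Python. Every number in it is invented for illustration; it describes no real city.

```python
# Toy example: how where the city limits sit changes a per-capita crime rate.
# All figures are made up for illustration; they describe no real city.

def rate_per_100k(incidents: int, population: int) -> float:
    """Crime rate expressed as incidents per 100,000 residents."""
    return incidents / population * 100_000

# An urban core and its surrounding suburbs, with fixed underlying crime.
core_incidents, core_population = 5_000, 500_000
suburb_incidents, suburb_population = 1_000, 500_000

# "Compact" city: the limits stop at the urban core.
compact_rate = rate_per_100k(core_incidents, core_population)

# "Sprawling" city: the limits also take in the quieter suburbs.
sprawl_rate = rate_per_100k(
    core_incidents + suburb_incidents,
    core_population + suburb_population,
)

print(f"Compact city:   {compact_rate:.0f} per 100k")  # 1000 per 100k
print(f"Sprawling city: {sprawl_rate:.0f} per 100k")   # 600 per 100k
```

Same urban core, same crime, but the sprawling city comes out looking 40 percent safer in a ranking, purely because of where the lines happen to be drawn.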
How, then, to understand the danger of a particular city?
You’re better off comparing cities against themselves over time, the country against itself over time, and, to the degree it’s comparable, cities against the country in well-defined statistics, like murder rates. For the first (cities against themselves), it’s not uncommon for local media or local research agencies to have strong data available, like the Chicago Tribune’s tracking of shooting victims. For the second and third (the country against itself, and cities against the country), FiveThirtyEight keeps tabs on this from time to time, almost always citing reliable, worthwhile sources.
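If you want to run that kind of comparison yourself, a minimal sketch might look like the following, assuming you’ve assembled yearly murder counts and populations from a reliable source into two CSV files. The file names and column names here are placeholders, not a real dataset.

```python
# Sketch: a city's murder rate against the national rate, year by year.
# Assumes two CSVs you'd build yourself from a trustworthy source, each
# with year, murders, population columns; all names here are placeholders.

import csv

def rates_per_100k(path: str) -> dict[int, float]:
    """Read year/murders/population rows; return year -> rate per 100k."""
    rates = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year = int(row["year"])
            rates[year] = int(row["murders"]) / int(row["population"]) * 100_000
    return rates

city = rates_per_100k("city_murders.csv")          # placeholder file
national = rates_per_100k("national_murders.csv")  # placeholder file

# The city against itself over time, and against the country, per capita.
for year in sorted(city.keys() & national.keys()):
    print(f"{year}: city {city[year]:.1f} vs. national {national[year]:.1f} per 100k")
```

The point isn’t the code; it’s the shape of the comparison: one jurisdiction against itself over time, normalized per capita, in a statistic that’s reported consistently.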
More broadly, when you see a clickbait piece citing Scranton as the 55th-most dangerous city in the country, look into its methodology, or the data source it’s pulling from. It’s possible Scranton is the 55th-most dangerous city in the country. It’s possible it’s safer than that. It’s possible it’s more dangerous than that. It’s possible your definition of “dangerous” is different from mine, or from the clickbait author’s. Simply put: don’t trust those lists unless you’ve researched them yourself.