How Zombie Crime Stats, Phantom Stats and Frankenstats Paint a Misleading Picture on Crime

In September 2017, newspapers across the country ran headlines of a similar theme: According to data from the FBI’s Uniform Crime Reports, the agency’s official report on criminal behavior nationwide, crime — or at least violent crime — had risen for the second year in a row.

That’s not entirely true. “Violent crime” hadn’t risen. The violent crimes that we count — the so-called “index crimes” of murder/manslaughter, rape, robbery, and aggravated assault — had risen. Simple assaults? Sexual assaults that don’t rise to the level of rape? We don’t measure those crimes. The crimes we do measure were all chosen during the development of the UCR in the late 1920s, on the grounds that they were common, serious, and generally reported — which is true, but we haven’t updated the list since.

And even saying that “index violent crimes” rose isn’t quite right. Index violent crimes reported to the police had gone up. But a large fraction of crimes are never reported: perhaps fewer than half of all violent crimes ever reach the police, and barely 50 percent of serious violent crimes. And the widely reported UCR data are based only on crimes recorded by the police.

Well, some of the police. Participation in the UCR is voluntary, so it provides data on index crimes reported to the police by departments that then report to the FBI, with some efforts to fill in the gaps from those that don’t report at all or provide incomplete data. About 5,000 of the nation’s 18,000 or so police agencies — more than a quarter — don’t appear to report sufficient data.

Oh, and the data are nearly an entire year out of date by the time they are reported to the public. The headlines in September 2017 about the rise in violent crimes were about the just-released UCR data… from 2016. Which, to be clear, is as close to just-in-time statistics as criminal justice stats get, but still potentially misleading. The number of homicides in Chicago rose by almost 60 percent from 2015 to 2016, but by the time the 2016 crime stats were released, Chicago was on course to see a 14 percent drop by the end of 2017.

So, “Violent crime is up!” is what the headlines say, but “According to agencies providing data to the FBI, the number of incidents of four serious types of violent crimes reported to or seen by the police rose nearly a year ago” is what they ought to say.

Welcome to the world of criminal justice statistics. At the heart of the push from being tough on crime to smart on crime is a desire to create a criminal justice system based on what works, and that should mean a criminal justice system that has accurate, up-to-date data that can shape and influence policy.

What we have instead is something akin to a horror movie bestiary. We have zombie statistics — numbers that haven’t been updated in years, like a detailed inmate survey that is supposed to be conducted every seven years but was last run in 2004. We have phantom statistics, those numbers that we ought to have but are invisible since we never gather them at all, such as anything on plea bargaining (despite the fact that about 95 percent of all guilty verdicts come from pleas… we think). And, perhaps worst of all, we have Frankenstatistics, those numbers that at first blush seem to measure one thing, but when looked at closely are tracking something altogether different.

Recidivism stats, for instance, are completely blind to an entire way of thinking about trends in reoffending. They can’t measure if someone is committing fewer crimes than before, only if he or she manages to completely avoid re-arrest. And they don’t really measure the trend they purport to measure in the first place (since they don’t track if the person fails to reoffend, only if he or she fails to be rearrested, which depends a lot on what the police are doing).

Far too often, journalists and policymakers alike invoke zombie statistics without acknowledging that they may no longer reflect current conditions, they cite Frankenstats at face value without considering what they are really measuring, and they rely on anecdotes to fill the gaps left by the phantoms. None of these practices is acceptable, even if some of them often feel unavoidable. (I myself have been forced to rely on anecdotes more than I’d like.)

But it is also understandable. The defects in our criminal justice statistics are buried deep in the fine print, invisible to all but those who spend their days mired in them. The UCR stats are noisy and complex and imperfect, yet the FBI reports them with such specificity — there were “exactly” 803,007 aggravated assaults in 2016, not 803,006 or 803,008 — that most people would likely think they are precisely measured. Our prison population statistics provide detailed national numbers, but unless you have access to the underlying data, which requires an application and a special encrypted hard drive and (it appears) an academic affiliation, you’d never know that Southern states systematically under-report data, which may introduce a bias, though of what sort we can’t really say.

There’s no reason we can’t have better criminal justice data. After all, other agencies produce detailed data far more rapidly. The Bureau of Labor Statistics releases employment data monthly, not with a nearly year-long lag. Of course, the BLS has a budget of almost $650 million, compared to under $40 million for the Bureau of Justice Statistics. Reliable up-to-date statistics cost money, money we’ve been so far unwilling to spend.

But there is some good news on the horizon. The FBI is hoping to complete a decades-long revamp of the UCR by 2021. The BJS is in the process of expanding its important national survey of criminal victimization to help explain what is happening at the local level. And while hamstrung by inadequate budgets, researchers at the BJS continue to work to improve and modernize other datasets as well.

In the meantime, however, it is essential that we understand exactly what our criminal justice statistics can and cannot say, and how they can both inform and mislead, which is what I intend to do here in the months ahead. The increased focus on data-driven criminal justice policy is an essential step forward, but it has to be done with an unflinching appreciation of just what that data looks like.