A new study came out earlier this month that put South Carolina at the bottom of the list of safe states to live in. This report and others like it are based partly or fully on crime statistics from the Federal Bureau of Investigation’s Uniform Crime Reporting Program.
And like those other reports, this latest one should be taken with a grain of salt. Rankings based on FBI crime statistics carry an air of certitude that doesn’t stand up to scrutiny.
The FBI’s most prominent set of crime statistics counts certain categories of crimes that are known to the police: crimes reported by the public or discovered by officers in the course of their patrol and investigative responsibilities. These numbers are most useful for monitoring police department workloads and helping administrators plan deployment and estimate future needs.
But when we use them to rank states or cities on public safety, we start asking questions that the data are not well suited to answer. In fact, the problem is so well known that both the FBI and the American Society of Criminology have specifically asked researchers and reporters not to do this.
Why is it a problem to use the FBI crime statistics this way? First, there are always some police departments whose data are missing. While the crime statistics program typically covers more than 90 percent of the U.S. population, state and local coverage varies from place to place and from one year to the next. For example, Columbia was not included in the FBI’s crime statistics program in 2011 and 2012, but it was included in 2010 and 2013.
This latest study also includes drug arrests. The link between drug arrests and home and community safety could be debated, but the problem here is that only 73 percent of arrests made in 2013 were included in the FBI database, a much lower coverage rate than for reported crimes.
Another problem is that many crimes — with real perpetrators and real victims — are not reported to the police. Consider aggravated assault, for which the study ranked South Carolina fifth worst in 2013. Each year, the United States conducts the National Crime Victimization Survey, which asks a nationally representative sample of respondents about their victimization experiences. Nationally, 64.3 percent of aggravated assault victims said their victimization had been reported to the police. We don’t have S.C. data, but we know that the figure in the South is 68.5 percent.
This raises an important set of questions: Are the FBI’s assault rates higher in South Carolina because there are more aggravated assaults in South Carolina? Or is it because aggravated assaults are more likely to come to the attention of the police in South Carolina? Maybe it’s some of both. Would the rankings change if we were able to adjust for reporting differences? The data are simply not strong enough to answer these questions.
People refrain from reporting crime to the police for many reasons, from fear of retaliation, to a preference for resolving the matter privately, to concerns about whether the police could do anything. There is no reason to assume that these non-reporting rates are the same from state to state. And if one state’s law enforcement agencies report more crime to the FBI’s crime statistics program than another state’s, it doesn’t follow that there is any meaningful safety difference between the two states.
South Carolina may rank near the bottom of the states in certain aspects of health and safety, but lists of crime and arrest rankings based on FBI statistics do not bring our understanding of this issue to higher ground.
Dr. Brame is a professor in USC’s Department of Criminology and Criminal Justice; contact him at BRAMER@mailbox.sc.edu.