Professor Kami Chavis Simmons writes about predictive policing technology in The New York Times
The New York Times
November 18, 2015
Professor Kami Chavis Simmons, a former federal prosecutor and director of Wake Forest Law’s Criminal Justice Program, authored “Police Technology Shouldn’t Replace Community Resources,” which was published in The New York Times on Wednesday, Nov. 18.
She is a frequent contributor to national and international media outlets and has appeared on CNN, CTV and NPR. She has also written for The Huffington Post and has been quoted in the Wall Street Journal, BBC News, U.S. News and World Report, International Business Times, Deutsche Welle and other outlets regarding police accountability and the structural reform of law enforcement agencies.
The original story, which can be found here, follows:
Law-enforcement agencies should use technology as part of a strategy to increase public safety. But as the President’s Task Force on 21st Century Policing stated, “While technology is crucial to law enforcement, it is never a panacea.”
Police should avoid over-reliance on algorithms and make sure residents are a part of any plan to reduce crime.
The task force cautioned that technology “can have unintended consequences for both the organization and the community it serves” and warned that agencies “must pay close attention to community concerns about its use.”
These words are especially salient with respect to predictive policing software that some researchers claim is better than human analysts at determining where crime is likely to occur and, thus, preventing it.
For example, a spate of car thefts on one block might mean that an auto-theft ring is targeting that particular area. A police department responds accordingly by implementing proactive interventions such as placing additional officers on patrol or setting up surveillance cameras to monitor surrounding areas. When public safety budgets are tight, strategically deploying limited resources in areas most likely to experience crime can have a significant impact.
Given this technology’s potential benefits, it is obvious why many police departments have plans to acquire it. But agencies must consider the limitations of this technology, as well as its potential impact on how these areas are policed.
Algorithms cannot inform police about the underlying conditions in the “hot spot” that contribute to crime in that area. A computer cannot tell the police department that rival gangs are about to engage in a violent confrontation over territory, but a local resident could. Thus, departments should avoid over-reliance on these technologies and must continue to focus on policing models that build community-police partnerships.
In a climate where many racial minorities, particularly those living in impoverished urban areas, already believe they experience unfair police scrutiny, departments must resist aggressive tactics that further erode their trust and legitimacy. Residents should be a part of implementing whatever interventions the department believes will be helpful in reducing crime.
This technology creates the risk that police view everyone in the “hot spot,” even the law-abiding residents, as a potential threat. This phenomenon creates tension and further destabilizes an area most in need of police protection.
These concerns do not mean that police should avoid using potentially valuable crime-reduction technologies, but they underscore that police training and community engagement remain important pieces in the mosaic of criminal justice reforms.