Predictive policing refers to the use of mathematical models, predictive analytics, and other analytical techniques in law enforcement to identify potential criminal activity. Predictive policing methods fall into four general categories: methods for predicting crimes, methods for predicting offenders, methods for predicting perpetrators' identities, and methods for predicting victims of crime.
The technology has been described in the media as a revolutionary innovation capable of "stopping crime before it starts". However, a RAND Corporation report on implementing predictive policing technology describes its role in more modest terms:
- Predictive policing methods are not a crystal ball: they cannot foretell the future. They can only identify people and locations at increased risk of crime ... the most effective predictive policing approaches are elements of larger proactive strategies that build strong relationships between police departments and their communities to solve crime problems.
In November 2011, TIME Magazine named predictive policing as one of the 50 best inventions of 2011, using the term "pre-emptive policing". In the United States, the practice of predictive policing has been implemented by police departments in several states such as California, Washington, South Carolina, Alabama, Arizona, Tennessee, New York, and Illinois.
Predictive policing uses data on the times, locations, and nature of past crimes to give police strategists insight into where and when patrols should maintain a presence, in order to make the best use of resources and maximize the chance of deterring or preventing future crimes. This type of policing detects signals and patterns in crime reports to anticipate whether crime will spike, when a shooting may occur, where the next car will be broken into, and who the next crime victim will be. The algorithms behind these predictions draw on large volumes of such data, and automation lets them weigh many variables far faster than manual analysis. The predictions an algorithm generates should be coupled with a prevention strategy, which typically sends an officer to the predicted time and place of the crime. Because decisions are backed by data rather than by officers' instincts alone, automated predictive policing is intended to offer a more accurate and efficient way to anticipate future crime. With this information, police can anticipate the concerns of communities, allocate resources to particular times and places, and work to prevent victimization. Predictive policing builds on hot spot policing, a method that has shown promise in reducing crime and that typically focuses on urban locations, or small areas in general, where crime is concentrated.
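The core of the place-based approach described above is aggregating past incidents by location to rank areas for patrol. A minimal sketch of that idea, assuming a simple square grid and made-up incident data (the cell size and record format are illustrative, not any department's actual system):

```python
from collections import Counter

# Hypothetical illustration: rank grid cells by historical incident counts.
# Each incident is (x, y, hour) in metres; the cell size is an assumption.
CELL = 500  # cell width in metres (assumed)

def hot_spots(incidents, top_n=3):
    """Count past incidents per grid cell and return the busiest cells."""
    counts = Counter(
        (int(x // CELL), int(y // CELL)) for x, y, _hour in incidents
    )
    return counts.most_common(top_n)

incidents = [
    (120, 80, 22), (130, 90, 23), (140, 70, 21),  # cluster in cell (0, 0)
    (900, 900, 14),                               # isolated incident
]
print(hot_spots(incidents))  # cell (0, 0) ranks first with 3 incidents
```

Real systems layer time-of-day, crime type, and decay weighting on top of such counts, but the ranking step is conceptually the same.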
Police may also use data accumulated on shootings and the sounds of gunfire to identify the locations of shootings. The city of Chicago blends data from population mapping, crime statistics, and weather to improve monitoring and identify patterns. PredPol, founded in 2012 by a UCLA professor, is one of the market leaders among predictive policing software companies. Its algorithm is based on the near-repeat model, which holds that if a crime occurs in a specific location, the surrounding properties and land are at elevated risk of subsequent crime. The algorithm takes into account crime type, crime location, and the date and time of the crime in order to predict future crime occurrences. Another program used for predictive policing is Operation LASER, which Los Angeles deployed in an attempt to reduce gun violence. LASER was discontinued in 2019 for several reasons, most notably inconsistencies in how people were labeled. Other police departments have likewise dropped such programs over the racial biases and ineffective methods associated with them. While the idea behind predictive policing is helpful in some ways, the model has always carried the potential to technologically reiterate social biases, deepening pre-existing patterns of inequality.
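The near-repeat model can be illustrated with a toy risk score in which each past crime raises the risk of nearby locations, with the contribution decaying over distance and time. This is an assumption-laden sketch of the general idea, not PredPol's proprietary model; the decay scales are invented for illustration:

```python
import math

# Toy near-repeat risk score (illustrative only): risk at a location rises
# after a nearby crime and decays exponentially with distance and elapsed time.
DIST_SCALE = 200.0  # metres (assumed decay scale)
TIME_SCALE = 7.0    # days (assumed decay scale)

def near_repeat_risk(location, now, past_crimes):
    """Sum decayed contributions from past crimes of the same type."""
    x, y = location
    risk = 0.0
    for cx, cy, day in past_crimes:
        dist = math.hypot(x - cx, y - cy)
        age = now - day
        if age >= 0:  # only crimes that already happened contribute
            risk += math.exp(-dist / DIST_SCALE) * math.exp(-age / TIME_SCALE)
    return risk

burglaries = [(100, 100, 0), (120, 110, 2)]  # (x, y, day) of past burglaries
print(near_repeat_risk((110, 105), now=3, past_crimes=burglaries))   # near recent crimes: high
print(near_repeat_risk((2000, 2000), now=3, past_crimes=burglaries)) # far away: near zero
```

Published academic work in this area uses self-exciting point processes of a similar flavor, with parameters fitted to historical data rather than fixed by hand.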
Attempts to predict crime within police departments can be traced back to work conducted by the Chicago School of Sociology on parole recidivism in the 1920s. Sociologist Ernest Burgess used that research to craft the actuarial approach, which identifies and weighs factors that correlate with future offending. The approach soon spread to other parts of the justice system, leading to the creation of prediction instruments such as the Rapid Risk Assessment for Sexual Offense Recidivism (RRASOR) and the Violence Risk Appraisal Guide (VRAG).
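The actuarial approach reduces, in its simplest form, to a checklist: each risk factor found to correlate with reoffending contributes a point, and the total is the predicted risk level. The factor names below are invented for illustration; validated instruments such as RRASOR and VRAG use empirically selected items and weights:

```python
# Sketch of a Burgess-style actuarial score (illustrative factors only):
# each binary risk factor present in a case record contributes one point,
# and higher totals indicate higher predicted risk of reoffending.
FACTORS = ["prior_offense", "young_at_first_arrest", "unstable_employment"]

def burgess_score(case):
    """Count how many listed risk factors are present in a case record."""
    return sum(1 for f in FACTORS if case.get(f))

case = {"prior_offense": True,
        "young_at_first_arrest": False,
        "unstable_employment": True}
print(burgess_score(case))  # → 2
```

Modern instruments replace the unit weights with statistically estimated ones, but the structure — a fixed list of weighted factors summed into a score — is the same.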
In 2008, Police Chief William Bratton at the Los Angeles Police Department (LAPD) began working with the acting directors of the Bureau of Justice Assistance (BJA) and the National Institute of Justice (NIJ) to explore the concept of predictive policing in crime prevention. In 2010, researchers proposed that it was possible to predict certain crimes, much like scientists forecast earthquake aftershocks.
In 2009, the NIJ held its first predictive policing symposium. At the event, Kristina Rose, acting director of the NIJ, said that the Shreveport, Los Angeles, D.C. Metropolitan, New York, Chicago, and Boston police departments were interested in implementing a predictive policing program. Predictive policing programs are now used by police departments in several U.S. states, including California, Washington, South Carolina, Arizona, Tennessee, New York, and Illinois, and have also been implemented in Europe, for example by Kent Police in the UK and in the Netherlands.
In 2012, the New Orleans Police Department (NOPD) began a secretive collaboration with Palantir Technologies in the field of predictive policing. James Carville claimed to be the impetus for the project, saying that "[n]o one in New Orleans even knows about this".
In China, the Suzhou Police Bureau has used predictive policing since 2013, and between 2015 and 2018 several other Chinese cities adopted it. China has used predictive policing to identify and target people to be sent to Xinjiang re-education camps.
In 2020, the Fourth Circuit Court of Appeals handed down a decision finding that predictive policing amounted to little more than reinforcement of a racist status quo. The court also held that granting the government an exigent-circumstances exemption in the case would be a broad rebuke of the landmark Terry v. Ohio decision, which set the standard for unlawful search and seizure. Predictive policing, typically applied to so-called 'high-crime areas', "relies on biased input to make biased decisions about where police should focus their proactive efforts", and without it police are still able to fight crime adequately in minority communities.
The effectiveness of predictive policing has been tested through multiple studies with varying findings. In 2015, the New York Times published an article that analyzed predictive policing's effectiveness, citing numerous studies and explaining their results.
A study conducted by the RAND Corporation found no statistical evidence that crime was reduced when predictive policing was implemented. The study notes that prediction is only half of effective predictive policing; carefully executed human action is the other half. Both depend heavily on the reliability of the input data: if the data is unreliable, the effectiveness of predictive policing can be disputed.
Another study, conducted by the Los Angeles Police Department (LAPD) in 2010, found the approach to be twice as accurate as the department's existing practices. In Santa Cruz, California, the implementation of predictive policing over a six-month period resulted in a 19 percent drop in the number of burglaries. In Kent, 8.5 percent of all street crime occurred in locations predicted by PredPol, beating the 5 percent achieved by police analysts.
A Max Planck Institute for Foreign and International Criminal Law evaluation of a three-year pilot of the Precobs (pre-crime observation system) software concluded that no definite statements could be made about the software's efficacy. The pilot project was scheduled to enter a second phase in 2018.
A particular strategy of predictive policing, hot spot policing, has had a positive effect on crime. Evidence provided by the National Institute of Justice shows that the method has decreased the frequency of several categories of offense, including violent and drug- and alcohol-related offenses. However, without careful execution and sufficient data, the method can perpetuate implicit bias and racial profiling.
According to the RAND Corporation study, data used for predictive policing can be severely insufficient when it suffers from data censoring, systematic bias, or poor relevance. Data censoring occurs when the data omits crime in certain areas. Systematic bias can arise when the data records a certain number of crimes but does not adequately record when they took place. Relevance refers to how useful the data is for driving predictions.
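The effect of data censoring can be shown with a toy example (the district names and counts are invented): if crimes in one district are systematically left out of the records, a count-based predictor never flags that district, no matter what its true crime rate is.

```python
from collections import Counter

# Toy illustration of data censoring: two districts with equal true crime,
# but reports from "south" are never recorded (an assumed omission).
true_crimes = ["north"] * 10 + ["south"] * 10
recorded = [d for d in true_crimes if d != "south"]  # south's reports omitted

def predicted_hot_spot(records):
    """Return the district with the most recorded crime, or None."""
    if not records:
        return None
    return Counter(records).most_common(1)[0][0]

print(predicted_hot_spot(recorded))  # "north": the censored district vanishes
```

The predictor is behaving correctly given its input; the failure is entirely in what the input leaves out, which is why the RAND study treats data quality as a precondition rather than a detail.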
These deficiencies have been documented to cause ineffective and discriminatory policing. One data collection effort, reported in "The Disproportionate Risks of Driving While Black", showed that black drivers were significantly more likely to be stopped and searched while driving. Such biases can be fed into the algorithms used for predictive policing and lead to higher levels of racial profiling and disproportionate arrests.
According to the RAND study, the effectiveness of predictive policing depends on input data of high quality and sufficient quantity; without such data, predictive policing produces inaccurate and harmful results. The study also notes that predictive policing is inaccurately described as the "end of crime": its effectiveness depends fundamentally on the tangible action taken in response to predictions.
A coalition of civil rights groups, including the American Civil Liberties Union and the Electronic Frontier Foundation, issued a statement criticizing the tendency of predictive policing to proliferate racial profiling. The ACLU's Ezekiel Edwards argues that such software is more accurate at predicting policing practices than at predicting crimes.
Recent research has also been critical of predictive policing. In 'To predict and serve?', Kristian Lum and William Isaac examine the consequences of training such systems on biased datasets, and Saunders, Hunt, and Hollywood demonstrate that the statistical significance of the predictions in practice verges on negligible.
In a comparison of predictive policing methods and their pitfalls, Logan Koepke concludes that it is not yet the future of policing but 'just the policing status quo, cast in a new name'.
In a testimony made to the NYC Automated Decision Systems Task Force, Janai Nelson, of the NAACP Legal Defense and Educational Fund, urged NYC to ban the use of data derived from discriminatory or biased enforcement policies. She also called for NYC to commit to full transparency on how the NYPD uses automated decision systems, as well as how they operate.
According to an article in the Royal Statistical Society, 'the algorithms were behaving exactly as expected – they reproduced the patterns in the data used to train them' and that 'even the best machine learning algorithms trained on police data will reproduce the patterns and unknown biases in police data'.
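The mechanism behind "reproducing the patterns in the data" is a feedback loop, and it can be simulated in a few lines. In this toy model (all numbers assumed), two districts have identical true crime rates, but only crime in the patrolled district gets recorded, so an initial imbalance in the records determines where patrols go and is then amplified rather than corrected:

```python
from collections import Counter
import random

# Toy feedback-loop simulation: equal true crime rates in districts A and B,
# but the single patrol goes wherever recorded crime is highest, and only
# patrolled crime is recorded. Rates and starting counts are assumptions.
random.seed(0)
TRUE_RATE = 0.5                      # identical true crime rate in both districts
records = Counter({"A": 3, "B": 1})  # small historical imbalance in the data

for _day in range(200):
    patrolled = max(records, key=records.get)  # patrol the "hot" district
    if random.random() < TRUE_RATE:            # a crime occurs there...
        records[patrolled] += 1                # ...and only it gets recorded

print(records)  # district A accumulates nearly all recorded crime; B stays at 1
```

District B's count never changes because it is never patrolled, so the system "confirms" its own initial bias — the dynamic the Royal Statistical Society article and the Lum–Isaac study describe.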
In 2020, following protests against police brutality, a group of mathematicians published a letter in Notices of the American Mathematical Society urging colleagues to stop work on predictive policing. Over 1,500 other mathematicians joined the proposed boycott.
Some applications of predictive policing have targeted minority neighborhoods and lack feedback loops.
Cities throughout the United States are enacting legislation to restrict the use of predictive policing technologies and other “invasive” intelligence-gathering techniques within their jurisdictions.
Following the introduction of predictive policing as a crime reduction strategy, using an algorithm generated by the software PredPol, the city of Santa Cruz, California saw burglaries decline by almost 20% in the program's first six months. Nevertheless, in late June 2020, in the aftermath of the killing of George Floyd in Minneapolis, Minnesota and amid growing calls for accountability among police departments, the Santa Cruz City Council voted in favor of a complete ban on the use of predictive policing technology.
The ban on predictive policing was accompanied by a similar prohibition of facial recognition technology, which has been criticized for its reduced accuracy on darker skin tones, a flaw that can contribute to cases of mistaken identity and, potentially, wrongful convictions.
In 2019, Michael Oliver of Detroit, Michigan was wrongfully accused of larceny when the DataWorks Plus software registered his face as a "match" to the suspect identified in a video taken by the victim of the alleged crime. Oliver spent months in court arguing his innocence; once the judge supervising the case viewed the video footage, it was clear that Oliver was not the perpetrator. In fact, the two did not resemble each other at all, beyond both being African-American, a group for which facial recognition technology is more likely to make an identification error.
With regard to predictive policing technology, the mayor of Santa Cruz, Justin Cummings, is quoted as saying, "this is something that targets people who are like me," referencing the patterns of racial bias and discrimination that predictive policing can perpetuate rather than stop.
As Dorothy Roberts explains in her academic journal article, Digitizing the Carceral State, the data entered into predictive policing algorithms to predict where crimes will occur, or who is likely to commit criminal activity, tends to contain information that has been impacted by racism. For example, the inclusion of arrest or incarceration history, neighborhood of residence, level of education, membership in gangs or organized crime groups, and 911 call records, among other features, can produce algorithms that suggest the over-policing of minority or low-income communities.
- Quantitative methods in criminology
- Carding (police policy)
- Crime analysis
- Government by algorithm
- "The Minority Report"
- Crime hotspots
- Racial profiling
- Rienks R. (2015). "Predictive Policing: Taking a chance for a safer future".
- Joel Rubin (21 August 2010). "Stopping crime before it starts". The Los Angeles Times. Retrieved 19 December 2013.
- "The 50 Best Inventions". Time. 28 November 2011. Retrieved 19 December 2013.
- Friend, Zach. "Predictive Policing: Using Technology to Reduce Crime". FBI Law Enforcement Bulletin. Federal Bureau of Investigation. Retrieved 8 February 2018.
- Levine, E. S.; Tisch, Jessica; Tasso, Anthony; Joy, Michael (February 2017). "The New York City Police Department's Domain Awareness System". Interfaces. 47 (1): 70–84. doi:10.1287/inte.2016.0860.
- "Predictive Policing Explained". Brennan Center for Justice. 2020-04-01. Retrieved 2020-11-19.
- National Academies of Sciences, Engineering (2017-11-09). Proactive Policing: Effects on Crime and Communities. doi:10.17226/24928. ISBN 978-0-309-46713-1.
- "Practice Details". CrimeSolutions, National Institute of Justice. Retrieved 2020-11-19.
- "Violent crime is down in Chicago". The Economist. Retrieved 2018-05-31.
- "Predict Prevent Crime | Predictive Policing Software". PredPol. Retrieved 2020-11-19.
- "NCJRS Abstract - National Criminal Justice Reference Service". www.ncjrs.gov. Retrieved 2020-11-19.
- "LAPD ends another data-driven crime program touted to target violent offenders". Los Angeles Times. 2019-04-12. Retrieved 2020-11-19.
- Winston, Ali (2018-04-26). "A pioneer in predictive policing is starting a troubling new project". The Verge. Retrieved 2020-11-19.
- Brayne, Sarah (2017). "Big Data Surveillance: The Case of Policing". American Sociological Review. 82.
- Ferguson, Andrew G. "Policing Predictive Policing". Retrieved 17 November 2020.
- Walter L. Perry (2013). Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations. RAND Corporation. p. 4. ISBN 978-0833081551.
- Rose, Kristina (November 19, 2009). "Predictive Policing Symposium: Opening Remarks" (PDF). Los Angeles, CA. p. 16. Retrieved November 13, 2020.
- "Predictive Policing day of action targets burglars". Kent Police. Archived from the original on 2014-05-02.
- Winston, Ali (27 February 2018). "Palantir has secretly been using New Orleans to test its predictive policing technology". The Verge. Retrieved 23 April 2020.
- ""大数据"给公安警务改革带来了什么" (in Chinese). 2014-10-09. Archived from the original on 2018-12-21. Retrieved 2015-04-21.
- "Exposed: China's Operating Manuals For Mass Internment And Arrest By Algorithm". ICIJ. 2019-11-24. Retrieved 2019-11-26.
- "'Big data' predictions spur detentions in China's Xinjiang: Human Rights Watch". Reuters. 2018-02-26. Retrieved 2019-11-26.
- Cushing, Tim. "Appeals Court Bashes Predictive Policing And The Judge Who Argued People In High Crime Areas Want Fewer Rights". Techdirt.
- "techdirt: Appeals Court Bashes Predictive Policing And The Judge Who Argued People In High Crime Areas Want Fewer Rights |".
- https://assets.documentcloud.org/documents/6997575/predpol4thCirc.pdf, pp. 31–32, 33–37.
- Patel, Faiza (November 18, 2015). "Be Cautious About Data-Driven Policing". www.nytimes.com.
- Perry, Walter L.; McInnis, Brian; Price, Carter C.; Smith, Susan; Hollywood, John S. (2013-09-25). "Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations". RAND Corporation.
- "Don't even think about it". The Economist. 20 July 2013. Retrieved 20 December 2013.
- "IfmPt - Institut für musterbasierte Prognosetechnik". www.ifmpt.com (in German).
- "Predictive Policing". www.mpicc.de. Max Planck Institute for Foreign and International Criminal Law.
- "5 Things You Need to Know About Hot Spots Policing & The "Koper Curve" Theory". National Police Foundation. 2015-06-30. Retrieved 2020-11-20.
- "Hot Spot Policing Can Reduce Crime". National Institute of Justice. Retrieved 2020-11-20.
- LaFraniere, Sharon; Lehren, Andrew W. (2015-10-24). "The Disproportionate Risks of Driving While Black (Published 2015)". The New York Times. ISSN 0362-4331. Retrieved 2020-11-20.
- "Statement of Concern About Predictive Policing by ACLU and 16 Civil Rights Privacy, Racial Justice, and Technology Organizations". American Civil Liberties Union.
- "Predictive Policing Software Is More Accurate at Predicting Policing Than Predicting Crime". American Civil Liberties Union.
- Lum, Kristian; Isaac, William (October 2016). "To predict and serve?". Significance. 13 (5): 14–19. doi:10.1111/j.1740-9713.2016.00960.x.
- Saunders, Jessica; Hunt, Priscillia; Hollywood, John S. (12 August 2016). "Predictions put into practice: a quasi-experimental evaluation of Chicago's predictive policing pilot". Journal of Experimental Criminology. 12 (3): 347–371. doi:10.1007/s11292-016-9272-0.
- Koepke, Logan (21 November 2016). "Predictive Policing Isn't About the Future". Slate.
- Nelson, Janai. "Testimony of Janai Nelson" (PDF). Archived from the original (PDF) on 8 June 2019. Retrieved 8 June 2019.
- Linder, Courtney (July 20, 2020). "Why Hundreds of Mathematicians Are Boycotting Predictive Policing". Popular Mechanics. Retrieved July 22, 2020.
- "Where in the World is AI? Responsible & Unethical AI Examples". map.ai-global.org.
- Kristi, Sturgill (June 26, 2020). "Santa Cruz becomes the first U.S. city to ban predictive policing". Los Angeles Times. Retrieved November 21, 2020.
- Simonite, Tom (July 22, 2019). "The Best Algorithms Struggle to Recognize Black Faces Equally". Wired.
- Cushing, Tim (July 14, 2020). "Detroit PD Now Linked To Two Bogus Arrests Stemming From Facial Recognition False Positives". Techdirt. Retrieved November 22, 2020.
- Asher-Schapiro, Avi (June 17, 2020). "In a U.S. first, California city set to ban predictive policing". Reuters.
- Roberts, Dorothy (April 10, 2019). "Digitizing the Carceral State". Harvard Law Review. 132.