Predictive policing algorithms: Put garbage in, get garbage out
By Elise Gonzalez | March 16, 2022


Image source: https://tinyurl.com/nz8n7xda

In recent years, “data-driven decision making”, which relies on data rather than human intuition alone to inform decisions, has seen a sharp rise in use across industries [1]. One industry adopting this approach is law enforcement. Predictive policing tools have been developed to alert police to where crime is likely to occur in the future, so that they can deter it more effectively and efficiently.

In a different and unbiased world, maybe tools like this would be reliable. In reality, because of the way they are designed, predictive policing tools merely launder the bias that has always existed in policing.

So, how are these tools designed? Let’s take two popular pieces of predictive policing software as examples: PredPol and Azavea’s HunchLab, which have been used in Los Angeles, New York, and Philadelphia, among other, smaller cities [2]. Each company has designed an algorithm, a set of instructions for how to handle different situations, that ranks locations by their relative risk of future crime. The algorithms base that risk on past instances of crime at or around each location, drawn from historical policing data. PredPol uses addresses where police have made arrests or filed crime reports; HunchLab uses the same, as well as addresses to which police have been called in the past [3, 4]. This information is presented to the algorithm as a good and true indicator of where crimes occur. The algorithm then predicts where crime is likely to occur in the future based on the examples it has seen, and nothing else. Those predictions inform decisions about where police should patrol, or where their presence may be the strongest crime deterrent.


HunchLab (left) and PredPol (right) user interfaces.
Image sources: https://tinyurl.com/2p8vbh7x (top), https://tinyurl.com/2u2u7cpu (bottom)
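
To make the mechanics concrete, here is a minimal sketch of what this kind of location scoring can look like. It is not PredPol’s or HunchLab’s actual model; the grid cells, records, and decay window below are hypothetical. The point is that a location’s score is driven entirely by past recorded incidents there, weighted toward recent ones.

```python
from collections import defaultdict
from datetime import date
from math import exp

# Hypothetical historical records: (grid_cell, event_date) pairs drawn from
# arrests, crime reports, or calls for service -- whatever the vendor feeds in.
records = [
    ("cell_A", date(2022, 1, 5)),
    ("cell_A", date(2022, 2, 20)),
    ("cell_B", date(2022, 3, 1)),
]

def risk_scores(records, today, decay_days=30.0):
    """Score each grid cell by a recency-weighted count of past recorded incidents.

    Recent incidents count for more than old ones; nothing else about the
    location is considered. The scores inherit whatever bias is in `records`.
    """
    scores = defaultdict(float)
    for cell, when in records:
        age = (today - when).days
        scores[cell] += exp(-age / decay_days)
    return scores

# Rank cells from highest to lowest predicted risk; the top of this list is
# where patrols would be directed.
ranked = sorted(risk_scores(records, date(2022, 3, 16)).items(),
                key=lambda kv: kv[1], reverse=True)
print(ranked)
```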

Algorithms like these lose their credibility because they base predictions of future crime on past police activity in an area. Years of research show that minority communities, and particularly Black communities, in the United States are over-policed relative to their majority-white counterparts [5]. For example, Black and white people are equally likely to possess or sell drugs in the United States, but Black people are arrested on drug charges at a rate 3 to 5 times higher than white people nationally [6]. Policing trends like this one cause Black communities to be over-represented in police records. This makes them far more likely to appear as hot spots for crime, when in reality they are hot spots for police engagement.
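
A toy simulation makes the point. The numbers below are invented, not drawn from any dataset: both neighborhoods have exactly the same underlying rate of offending and differ only in how heavily they are patrolled. The recorded counts, which are all an algorithm ever sees, still diverge sharply.

```python
import random

random.seed(0)

# Toy illustration, not real data: two neighborhoods with the SAME underlying
# rate of a given offense, but different levels of police presence.
true_offense_rate = 0.05                     # identical in both neighborhoods
patrol_intensity = {"neighborhood_A": 0.9,   # heavily policed
                    "neighborhood_B": 0.2}   # lightly policed

recorded = {name: 0 for name in patrol_intensity}
for _ in range(10_000):                      # 10,000 residents each
    for hood, patrol in patrol_intensity.items():
        offense = random.random() < true_offense_rate
        observed = offense and random.random() < patrol
        if observed:                         # only observed offenses enter the data
            recorded[hood] += 1

print(recorded)
# e.g. {'neighborhood_A': ~450, 'neighborhood_B': ~100}: identical behavior,
# very different recorded "crime" -- and an algorithm trained on `recorded`
# would flag neighborhood_A as the hot spot.
```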

Calls for service also do not reflect the actual incidence of crime. In the last few years, popular media have reported many examples of police being called on Black Americans who are simply going about their lives – barbecuing at a park, sitting at Starbucks, or eating lunch [7]. Because of examples like these, presenting calls for service as a good and true representation of where crime occurs is misleading.

In short, predictive policing algorithms do not have minds of their own. They cannot remove bias from the data they are trained on; they cannot even identify it. They take as fact what we know to be the result of years of biased policing: that more crime happens in neighborhoods with more Black residents, and less in majority-white neighborhoods. This leads them to make predictions of future crime that reproduce that bias. This is the idea of garbage in, garbage out: “If you produce something using poor quality materials, the thing that you produce will also be of poor quality” [8]. As these allegedly unbiased algorithms, and others like them, are increasingly used to make life-altering decisions, it is critically important to be aware of the ways they can reproduce human bias. In this case, as in many, human bias is never removed from the process of making predictions; it is only made harder to see.

References
[1] Harvard Business Review Analytic Services. (2012). The Evolution of Decision Making: How Leading Organizations Are Adopting a Data-Driven Culture. Harvard Business Review. https://hbr.org/resources/pdfs/tools/17568_HBR_SAS%20Report_webview.pdf

[2] Lau, T. (2020, April 1). Predictive Policing Explained. Brennan Center for Justice. Retrieved March 10, 2022, from https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained

[3] PredPol. (2018, September 30). Predictive Policing Technology. https://www.predpol.com/technology/

[4] Team Upturn. (n.d.). HunchLab — a product of Azavea · Predictive Policing. Team Upturn Gitbooks. https://teamupturn.gitbooks.io/predictive-policing/content/systems/hunchlab.html

[5] American Civil Liberties Union. (2020, December 11). ACLU News & Commentary. Retrieved March 10, 2022, from https://www.aclu.org/news/criminal-law-reform/what-100-years-of-history-tells-us-about-racism-in-policing/

[6] Human Rights Watch. (2009, March 2). Decades of Disparity. Retrieved March 10, 2022, from https://www.hrw.org/report/2009/03/02/decades-disparity/drug-arrests-and-race-united-states#

[7] Hutchinson, B. (2018, October 20). From “BBQ Becky” to “Golfcart Gail,” list of unnecessary 911 calls made on blacks continues to grow. ABC News. Retrieved October 3, 2022, from https://abcnews.go.com/US/bbq-becky-golfcart-gail-list-unnecessary-911-calls/story?id=58584961

[8] The Free Dictionary by Farlex. (n.d.). garbage in, garbage out. Collins COBUILD Idioms Dictionary, 3rd ed. (2012). Retrieved March 10, 2022, from https://idioms.thefreedictionary.com/garbage+in%2c+garbage+out