When an Algorithm Replaces Cash Bail
Allison Godfrey
October 9th, 2020

In an effort to make the criminal justice system more equitable, California Senate Bill 10 replaced cash bail with a predictive algorithm that produces a risk assessment score used to decide whether the accused should remain in jail before trial. The risk assessment places suspects into low-, medium-, or high-risk categories. Low-risk individuals are generally released before trial, while high-risk individuals remain in jail. For medium-risk individuals, the judge has much more discretion over pretrial placement and the conditions of release. The bill also releases people charged with most misdemeanors without a risk assessment. It was signed into law in 2018 and was scheduled to take effect in October 2019. California Proposition 25, on the November 2020 ballot, asks voters whether to uphold or reject the bill; those pushing for a return to cash bail argue that the algorithm biases the system even more than cash bail does.

People often see data and algorithms as purely objective because they are built on numbers and formulas. In practice, however, these are often “black box” models: we have no way of knowing exactly how the algorithm arrived at its output. And if we cannot follow the model’s logic, we have no way of identifying and correcting its bias.


By their nature, predictive algorithms learn from data in much the same way that humans learn from life’s inputs (experiences, conversations, schooling, family, and so on). Our life experiences make us inherently biased, since each of us holds a unique perspective shaped purely by that set of experiences. Similarly, an algorithm learns from the data we feed into it and spits out the perspective that data creates: an inherently biased perspective. Say, for example, we feed a predictive model data about 1,000 people with pending trials. The Senate Bill is not clear about the model’s exact inputs, so suppose we give it the following attributes for each person: age, gender, charge, past record, income, zip code, and education level. We exclude each person’s race from the model in an effort to eliminate racial bias. But have we really eliminated racial bias?
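
To make the setup concrete, here is a minimal sketch in Python of what such a risk model might look like. It is purely illustrative: Senate Bill 10 does not disclose the model, its training data, or its exact features, so the synthetic data, the feature names, the outcome label, and the choice of a logistic regression below are all hypothetical stand-ins.

```python
# Purely illustrative sketch: SB 10 does not disclose the model, its features, or its
# training data, so everything below (feature names, outcome label, logistic regression)
# is a hypothetical stand-in built on synthetic data.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 1_000

# Hypothetical attributes for 1,000 people with pending trials; race is deliberately excluded.
people = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "gender": rng.choice(["male", "female"], n),
    "charge": rng.choice(["theft", "assault", "fraud"], n),
    "prior_convictions": rng.poisson(1.0, n),
    "income": rng.normal(45_000, 15_000, n).round(),
    "zip_code": rng.choice(["90001", "90210", "94103"], n),
    "education": rng.choice(["high_school", "bachelors", "graduate"], n),
})

# Synthetic outcome label standing in for "failed to appear / was re-arrested".
outcome = rng.binomial(1, 0.3, n)

categorical = ["gender", "charge", "zip_code", "education"]
model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"), categorical)],
        remainder="passthrough",
    )),
    ("classify", LogisticRegression(max_iter=1_000)),
])
model.fit(people, outcome)

# The "risk score" is just a predicted probability, bucketed into low / medium / high.
scores = model.predict_proba(people)[:, 1]
tiers = pd.cut(scores, bins=[0.0, 0.33, 0.66, 1.0], labels=["low", "medium", "high"])
print(pd.Series(tiers).value_counts())
```

Note that race never appears among the inputs, yet, as the next example shows, nothing here stops other features from standing in for it.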


Let’s compare two people: Fred and Marc. Fred and Marc face the exact same charge, identify as the same gender, have similar incomes, and both hold bachelor’s degrees, but they live in different towns. The model learns from past data that people from Fred’s zip code are generally more likely to commit another crime than people from Marc’s zip code. Thus, Fred receives a higher risk score than Marc does, and he awaits his trial in jail while Marc is allowed to go home. Because of the history and continuation of systemic racism in this country, neighborhoods are often racially and economically segregated, so people from one zip code may be much more likely to be people of color and lower-income than those from the neighboring town. By including an attribute like zip code, then, we introduce economic and racial bias into the model even though race is never explicitly given to it. While the original goal of Senate Bill 10 was to eliminate wealth as a determining factor in bail decisions, the algorithm inadvertently reintroduces wealth as a predictor through the economic bias woven into its inputs. Instead of balancing the scales of the criminal justice system, the algorithm tips them even further.
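
Fred and Marc’s story can be made visible in a few lines of code. The sketch below again uses entirely synthetic, hypothetical data, in which zip code is assumed to correlate with race (mirroring residential segregation) and the historical outcome labels are assumed to be skewed by neighborhood; race is never shown to the model, yet its average risk score still splits along racial lines.

```python
# Illustrative audit of the proxy effect, on hypothetical synthetic data: race is never an
# input, but because zip code tracks race and the historical labels are skewed by
# neighborhood, the model's scores split along racial lines anyway.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(1)
n = 1_000

# Assumed correlation: most of group A lives in two zip codes, most of group B in two others.
race = rng.choice(["group_a", "group_b"], n)
zip_code = np.where(
    race == "group_a",
    rng.choice(["90001", "90002"], n, p=[0.8, 0.2]),
    rng.choice(["90210", "94103"], n, p=[0.8, 0.2]),
)

# Assumed skew in the historical data (e.g., from heavier policing of some neighborhoods):
# the recorded outcome rate is higher in group A's zip codes.
base_rate = np.where(np.isin(zip_code, ["90001", "90002"]), 0.45, 0.20)
outcome = rng.binomial(1, base_rate)

# Train on zip code alone -- race never enters the model, just as in Fred and Marc's case.
X = OneHotEncoder().fit_transform(zip_code.reshape(-1, 1))
model = LogisticRegression().fit(X, outcome)
scores = model.predict_proba(X)[:, 1]

# Average predicted "risk" by race: a gap appears even though race was excluded.
print(pd.Series(scores).groupby(pd.Series(race)).mean())
```

Dropping zip code would not necessarily close the gap, either, since attributes like income, education, and past record can carry the same correlations.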


Additionally, the purpose of cash bail is to ensure that the accused shows up to their trial. While the cash bail system can be economically inequitable, the algorithm does not appear to address this primary purpose: nothing in Senate Bill 10 helps ensure that the accused will be present at their trial.

Lastly, Senate Bill 10 allows judicial discretion in any case, and particularly in medium-risk cases. Human bias in the courtroom has historically played a large role in the inequality of our justice system. The discretion a judge has to overrule the risk assessment score could reintroduce the very human bias the model partly seeks to avoid, and judges have been shown to exercise this power more often to place someone in jail than to release them. In the time of Covid-19, going to jail also carries an increased risk of infection. Given that heightened risk, our decision system, whether algorithmic, monetary, human-centered, or some combination of these, should err more on the side of release, not detainment.

The fundamental question is one that neither cash bail nor algorithms can answer:
How do we eliminate wealth as a determining factor in the justice system without introducing other biases that perpetuate systemic racism in the courtroom?