California Just Replaced Cash Bail With Algorithms
Starting in October 2019, an algorithm will grade pretrial defendants to guide whether they should be released before trial.
Instead of posting cash as collateral for freedom before trial, people accused of crimes in California will be graded by an algorithm starting in October 2019. A county official will then use that grade to recommend whether the accused should be released or remain in jail.
California Governor Jerry Brown signed SB 10 into law last week, a bill that replaces cash bail with an algorithmic system. Before the October 2019 deadline, each county will have to put in place a system to assess a suspect’s risk of fleeing or committing another crime during the trial process, whether that means using a system from a third-party contractor or developing one itself.
Activists have railed against cash bail for years, which should have made this legislation a sweeping win for civil rights organizations. Their argument is that cash bail makes justice an uneven playing field, incarcerating the poor while allowing those with money or assets to avoid jail time.
But algorithmic risk assessment makes the details of how this new legislation is implemented all the more important. While some activists say that cash bail created a predatory industry that keeps people tied to the criminal justice system, others point to the biases against people of color documented in algorithmic risk assessment tools used in the past.
“It is a tremendous setback,” Raj Jayadev, a coordinator at the civil rights organization Silicon Valley De-Bug, told Quartz. “This will, in our analysis, lead to an increase in pretrial detention.”
That’s because the machine learning systems used to calculate these risk scores throughout the criminal justice system have been shown to hold severe racial biases, scoring people of color as more likely to commit future crimes. Furthermore, since private companies have typically been contracted to offer these services, the formulas their algorithms use to calculate these scores are generally withheld as intellectual property that would tip competitors to the company’s technology.
Jessica Bartholow, policy advocate for the Western Center on Law & Poverty, which co-sponsored the bill, says that algorithmic risk assessments have been on the table since the bill’s inception, and that civil rights organizations have overwhelmingly supported the move away from cash bail. She says, in contrast to Jayadev’s predictions, that fewer people will be kept incarcerated, including poor people who would previously have been forced to pay bail and associated court fees or stay in jail.
Now that the bill has passed, it’s a matter of working with policymakers to ensure that the data and methodology behind risk assessment are fair and transparent, Bartholow says. The law already mandates a review in 2023 to check the new system for bias, but four years is a long time to hold people in jail before their trials.
Members of California’s state legislature agree that there are still details to be worked out: State Senator Robert Hertzberg has already committed to introducing new legislation that will ensure transparency of risk assessment tools, his office confirmed to Quartz. In a legislative hearing in August, Senate budget chair Holly Mitchell agreed to co-author the legislation.
Rashida Richardson, policy director for AI Now, a nonprofit think tank dedicated to studying the societal impact of AI, tells Quartz that she’s skeptical that this system will be less biased than its predecessors.
“A lot of these criminal justice algorithmic-based systems are relying on data collected through the criminal justice system. Police officers, probation officers, judges, none of the actors in the criminal justice system are data scientists,” Richardson said. “So you have data collection that’s flawed with a lot of the same biases as the criminal justice system.”
But there has been some movement toward openness in the data and factors used: Jayadev says that activist pushback has forced some companies to start disclosing which factors are considered when calculating risk scores, such as a suspect’s previous criminal history and the severity of the alleged crimes, but not how much each weighs into the final risk score. When some risk assessment algorithms weigh more than 100 factors, a lack of transparency makes it difficult to tell whether the factors an algorithm treats as influential correlate with race, gender, or any other protected demographic.
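To make the stakes of that opacity concrete, here is a minimal sketch in Python of how a weighted risk score could work. The factors, values, and weights below are invented for illustration and do not come from SB 10 or any vendor’s actual tool; the point is only that disclosing which factors are used, without their weights, leaves the score’s behavior opaque.

```python
# Hypothetical illustration only: a toy weighted risk score.
# These factors and weights are invented; real tools keep their
# formulas proprietary, which is the transparency problem at issue.

def risk_score(factors, weights):
    """Weighted sum of factor values; higher means 'riskier' in this toy model."""
    return sum(weights[name] * value for name, value in factors.items())

# Disclosed: which factors are considered.
factors = {
    "prior_convictions": 3,         # count
    "prior_failures_to_appear": 1,  # count
    "age_at_arrest": 24,            # years
    "charge_is_violent": 1,         # 1 = yes, 0 = no
}

# Not disclosed: how much each factor counts. Two weightings that
# "use the same factors" can behave very differently.
weights_a = {"prior_convictions": 1.0, "prior_failures_to_appear": 2.0,
             "age_at_arrest": -0.05, "charge_is_violent": 1.5}
weights_b = {"prior_convictions": 0.2, "prior_failures_to_appear": 0.5,
             "age_at_arrest": -0.2, "charge_is_violent": 4.0}

print(risk_score(factors, weights_a))  # ≈ 5.3
print(risk_score(factors, weights_b))  # ≈ 0.3
```

Both versions consider the same disclosed factor list, yet they would rank the same person very differently, and in one of them a factor like age, which can correlate with other demographics, dominates the score. That gap between the factor list and the weights is what advocates want made visible.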
Until the review in 2023, it will be difficult to know the real impact of the bill.
“If we end up with a black box for an assessment tool, and we’re not able to run the data, and we’re not able to make changes swiftly to a computer system, then we’re surrendering to an algorithm that we can’t change or really know,” Bartholow said. “One thing I would say, though, is poverty is the biggest algorithm of them all—who is born with money and wealth and access to it and who is not—and today the money bail system surrenders people to that.”
Dave Gershgorn is a reporter at Quartz, where this article was originally published.