County DA pilots race-blind charging algorithm
Using software developed by the Stanford Computational Policy Lab, the Yolo County, Calif., District Attorney’s Office is studying whether its decisions to charge an individual with a crime are influenced by race.
The Yolo County, Calif., District Attorney’s (DA) Office embedded a redaction algorithm into its digital case management system to help it study racial bias in charging decisions.
The office announced Sept. 9 that it has been using the algorithm, developed by the Stanford Computational Policy Lab (SCPL), since May as part of its race-blind charging program.
“The most important decision ever made in the criminal justice system is that decision on whether to charge somebody with a crime,” Yolo County DA Jeff Reisig said. “It’s the only stage in the entire criminal justice process where a prosecutor can truly make a race-blind decision because once you go to court, you know what the race is. Police officers know the race in the field. It’s really the only spot that you can put this kind of protective tool in.”
The office was the first in the state to go paperless, and Reisig said he had wanted to build technology to enable race-blind charging ever since he read about a 2019-20 pilot of SCPL's race-blind algorithm in San Francisco. He reached out to SCPL to see whether Yolo would be a good fit.
The lab found the county “particularly well set up to adopt the algorithm in their own case management system … which allows them to basically just incorporate our redaction algorithm inside of their software,” SCPL Executive Director Alex Chohlas-Wood said. “The other thing that was really important was the ability to get digital copies of the [police] narratives, so that we could actually run automated redaction on these narratives. Most other DAs get scanned PDFs or paper documents that wouldn’t allow us to do anything automatically.”
Narratives are the free-text portion of a police report that describes an incident. The police department provides the digital PDFs, and the system pulls the narratives into a text box to run the automation.
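For readers curious what that extraction step might look like, here is a minimal sketch in Python, assuming the pypdf library; the article does not describe SCPL's actual tooling.

```python
from pypdf import PdfReader  # assumption: any PDF text extractor would work here

def extract_narrative(pdf_path: str) -> str:
    """Pull the free text out of a digitally produced police report.

    This only works on true digital PDFs; scanned documents would need
    OCR first, which is why files from most other DA offices can't be
    processed automatically.
    """
    reader = PdfReader(pdf_path)
    return "\n".join(page.extract_text() for page in reader.pages)
```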
“We’re basically looking for any kind of written information that might indicate a suspect’s or a victim’s race,” Chohlas-Wood said. “These are things like somebody’s name -- whether it’s a suspect’s name, a victim’s name, a witness’s name -- basically any involved individual,” he said. “We redact their names, we redact any physical description like hair or eye color. We obviously redact skin tone descriptions or explicit mentions of race.” Geographic location is often an indicator of race, so that’s also redacted when possible.
Rather than replacing the text with a black box, the algorithm inserts a placeholder. For instance, instead of a name, it substitutes “Suspect 1.”
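As an illustration of that placeholder approach, the sketch below substitutes role labels for names drawn from a report's structured fields and masks a toy list of descriptor patterns. It is a simplified stand-in, not SCPL's published algorithm, and the patterns shown cover only a fraction of what real redaction would need.

```python
import re

# Illustrative only: a toy subset of race-indicative descriptors.
DESCRIPTOR_PATTERNS = [
    r"\b(black|white|hispanic|latino|asian)\b",
    r"\b(brown|blue|green|hazel) eyes\b",
    r"\b(black|brown|blond|blonde|red|gray) hair\b",
    r"\b(dark|light|medium)[- ]skinned?\b",
]

def redact_narrative(narrative: str, people: dict[str, str]) -> str:
    """Replace names and descriptors with neutral placeholders.

    `people` maps a name from the report's structured fields to a role
    label, e.g. {"John Doe": "Suspect 1", "Jane Roe": "Witness 1"}.
    """
    redacted = narrative
    # Swap each involved individual's name for a role placeholder.
    for name, placeholder in people.items():
        redacted = re.sub(re.escape(name), placeholder, redacted,
                          flags=re.IGNORECASE)
    # Mask physical descriptions that could signal race.
    for pattern in DESCRIPTOR_PATTERNS:
        redacted = re.sub(pattern, "[REDACTED]", redacted,
                          flags=re.IGNORECASE)
    return redacted

print(redact_narrative(
    "John Doe, a white male with blue eyes, fled on foot.",
    {"John Doe": "Suspect 1"},
))
# -> "Suspect 1, a [REDACTED] male with [REDACTED], fled on foot."
```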
Yolo County then takes the following steps. First, after reading the redacted report, the deputy DA answers questions from the program, such as whether the redaction quality was good or poor, with "poor" meaning words that could identify the suspect's race were left unredacted.
Second, the deputy DA states how likely it is that the case will be charged. Only after that decision does the office review the unredacted report, along with additional information – rap sheets and photographic, video or audio evidence – that had been withheld until this point because it’s harder to redact.
Lastly, if the deputy DA changes the charging decision after seeing the unredacted information, the program asks for an explanation for the switch. SCPL compiles the data from each case and analyzes it to determine whether bias factored into the charge.
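To make that workflow concrete, a record like the hypothetical one below could capture each case's two-stage review; the field names and the changed-decision check are assumptions for illustration, not Yolo County's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical schema for logging the two-stage review described above.
@dataclass
class ChargingReview:
    case_id: str
    redaction_quality: str   # step 1: "good" or "poor"
    blind_decision: str      # step 2: charging likelihood from the redacted report
    final_decision: str      # step 3: decision after the unredacted review
    switch_explanation: Optional[str] = None  # step 4: required only on a switch

    def decision_changed(self) -> bool:
        """Flag cases where unblinding the report altered the outcome."""
        return self.blind_decision != self.final_decision
```

Aggregating these records across cases is what would let analysts look for a pattern of decisions flipping once race-identifying information is revealed.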
To train the algorithm, SCPL uses historical data from San Francisco and Yolo County police departments, in addition to information from other sources such as the state highway patrol. The lab is offering the code for free.
Although SCPL is not releasing any results on its work with Yolo yet, Reisig said, “we’re not seeing any evidence that people are making a race-blind decision and then revealing the race in the secondary part of the process and changing their mind.”
That’s positive, but perhaps even more important, simply using the algorithm shows county residents that the criminal justice system is committed to fairness.
“Our hope is we’re doing something to specifically address implicit bias or bias period in the charging process and, two, that we’re sending a strong message to the community that we take this seriously and there’s procedural justice in Yolo County when it comes to these decisions,” Reisig said. “For more of the prosecutors across America, I’m guessing if they implement this kind of technology, what they’re going to see is what I’m seeing -- and that’s that they haven’t had a real problem with bias in their charging decisions,” he said. “But the power of the procedural justice confidence-building is well worth doing it.”