Taking Race Out of Criminal Charges
Prosecutors in Yolo County, California, are attempting “race-blind charging” by using software to redact identifying information, including race and other descriptors, from police reports.
Prosecutor Carolyn Palumbo scrolled through a police report to determine whether to file charges against the defendants, a normal part of her job as a deputy district attorney in Yolo County, California. But this report—a shoplifting case involving two suspects—looked a little different.
“In this narrative, there is no identifying information,” Palumbo told reporters at a press conference in September. “I know the market is in Davis, but I don’t know the specific location. It says the officer detained ‘defendant 1,’ but I don’t know the name, race or if the defendant is male or female.”
The redactions were intentional, scrubbed by “race-blind charging” software that aims to remove implicit bias from a key stage of the criminal justice process. The software, developed by researchers at the Stanford Computational Policy Lab, has been used by the district attorney’s office since May in partnership with two police departments, one in Davis and the other in West Sacramento. It is now, said District Attorney Jeff Reisig, “part of the DNA” of case management in his office.
“We’re all in on this,” he said. “If a prosecutor makes the decision to charge, that’s where the formal initiation of criminal proceedings begins. It is the most critical phase of the criminal justice system, so we feel like this is a critical time to use it.”
It’s also perhaps the only step in the judicial process where it’s possible to remove the issue of race, Reisig said. Police officers responding to a scene immediately know the race of the people involved, and judges and juries can instantly see whether a defendant is a person of color.
“But prosecutors—we’re looking at paperwork,” he said. “The only time where you can truly implement a race-blind process is at the charging phase.”
The software, provided to the office for free, takes police reports from participating law enforcement agencies and redacts the identifying details of the people and locations in each incident. Prosecutors review each report, then tell the system via a drop-down menu whether it’s a good redaction (no identifying details) or a bad one (where a name, neighborhood or other detail slipped past the algorithm).
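In outline, that step works like a structured find-and-replace over the narrative, followed by a check that mirrors the prosecutor's drop-down verdict. The Python sketch below illustrates the idea only; the Stanford lab's actual pipeline has not been published, so the entity lists, token names and patterns here are all assumptions.

```python
import re

# Hypothetical sketch: the real tool's redaction logic is not public.
# These lists stand in for whatever entity data the pipeline extracts.
ROLE_TOKENS = {"John Doe": "Defendant 1", "Jane Roe": "Defendant 2"}
LOCATIONS = ["Davis", "West Sacramento"]
DESCRIPTORS = ["black", "white", "hispanic", "asian", "male", "female"]

def redact(narrative: str) -> str:
    """Replace names and locations with neutral tokens and strip
    physical descriptors from a police narrative."""
    text = narrative
    for name, token in ROLE_TOKENS.items():
        text = re.sub(re.escape(name), token, text, flags=re.IGNORECASE)
    for loc in LOCATIONS:
        text = re.sub(re.escape(loc), "[LOCATION]", text, flags=re.IGNORECASE)
    for word in DESCRIPTORS:
        text = re.sub(rf"\b{word}\b", "[REDACTED]", text, flags=re.IGNORECASE)
    return text

def looks_clean(redacted: str, extra_terms: tuple[str, ...] = ()) -> bool:
    """Stand-in for the prosecutor's drop-down verdict: the redaction
    is 'bad' if any known descriptor or extra term slipped through."""
    for word in [*DESCRIPTORS, *extra_terms]:
        if re.search(rf"\b{re.escape(word)}\b", redacted, re.IGNORECASE):
            return False
    return True

report = "Officer detained John Doe, a white male, near the Davis market."
blind = redact(report)
# -> "Officer detained Defendant 1, a [REDACTED] [REDACTED], near the [LOCATION] market."
assert looks_clean(blind)
```

In the county's actual process the clean/bad judgment is made by a human reviewer, not a second pass of pattern matching; the `looks_clean` check above is only a placeholder for that review.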
The program excludes sex crimes, domestic violence cases and homicides, all of which often rely on a suspect’s criminal history to determine whether charges are warranted.
Incomplete redactions are removed from the race-blind charging program, said Jonathan Raven, chief deputy district attorney for Yolo County. If the redaction is clean and the prosecutor decides to press charges based on the information provided, they’re given access to the full police report.
“Then they make another charging decision,” Raven said. “And if the decision has changed based on the new report, they have to provide a reason, which we review to determine whether it’s race neutral.”
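In code terms, the workflow Raven describes amounts to a small record that pairs the blind decision with the final one and refuses to accept a changed decision without a stated reason. The sketch below is hypothetical; the class and field names are illustrative, not the office's case management schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChargingRecord:
    """Hypothetical record of the two-pass review; names are
    illustrative, not Yolo County's actual data model."""
    case_id: str
    blind_decision: bool                   # charge, based on the redacted report?
    final_decision: Optional[bool] = None  # decision after seeing the full report
    change_reason: Optional[str] = None    # required whenever the two differ

    def finalize(self, decision: bool, reason: Optional[str] = None) -> None:
        # Per the county's workflow, a changed decision must carry an
        # explanation that supervisors then review for race neutrality.
        if decision != self.blind_decision and not reason:
            raise ValueError("a reason is required when the decision changes")
        self.final_decision = decision
        self.change_reason = reason

# Example: the blind decision was to charge, but the full report changed it.
record = ChargingRecord(case_id="24-0117", blind_decision=True)
record.finalize(False, reason="witness description matches surveillance footage")
```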
Changing decisions is not unheard of and does not necessarily imply bias. Sometimes a defendant’s race or physical characteristics line up with a witness description or footage from a security camera, or their full name is attached to a criminal record filled with similar incidents that can be used in court to prove guilt.
“But 99% of the time, we don’t need to look at any of that in order to make a decision based on the police narrative,” Reisig said.
The Stanford Computational Policy Lab will eventually review data from the pilot program to help determine whether, or how much, implicit bias is influencing charging decisions by county prosecutors. County officials will have access to that information, with the ability to drill down to determine whether individual attorneys have patterns of bias. Consequences would vary depending on the scope of the problem, Reisig said.
“It could be a sit-down with that attorney to evaluate their decision making, it could be additional training, or it could be something more significant,” he said. “We have to get the data to really figure out if they’re being affected by implicit bias, or, God forbid, explicit bias—but either way the idea is that we’re going to root out the problem, fix it and get rid of it.”
Preliminary results don’t indicate a problem, though the office won’t have official numbers until at least next spring. Even if the data show no evidence of bias, Reisig said, the office will continue to use the system indefinitely.
“If the numbers are low, which I think they’re going to be, then hallelujah, that’s good news,” he said. “But nonetheless, it is addressing a public perception that is real to people: that the criminal justice system is broken, and that it is systemically biased. So regardless of the results, the tool—built and evaluated by a third party—serves the purpose of building public confidence in the integrity of the process.”
Nationwide Deployment Sought
The software is open source, and the policy lab hopes to deploy it nationwide, though Yolo County is so far the only office testing it. The program there is backed by the office’s Multi-Cultural Community Council, a citizen advisory board that meets regularly with Reisig to discuss reform efforts.
“If we are honest, we as humans still struggle to look past our physical identities like race, and we know that color can color the perceptions of what is true, what is just and what is equitable,” Tessa Smith, the council’s chairwoman, said at the press conference. “We’re all working on it. But until we get it down pat in our criminal justice system—not just here in Yolo County but in every county across the country—we could do well to embed this instrument into our case management system.”
Kate Elizabeth Queram is a senior reporter for Route Fifty and is based in Washington, D.C.