How to Fairly Use Algorithms to Make Tough Decisions
COMMENTARY | With computer power increasingly used to guide policies adopted by state and local leaders, governments need to take steps to ensure the underlying algorithms aren’t biased.
The power of data to help states and localities make decisions more effectively has long been a given. But the use of complex algorithms in government complicates that picture, raising essential questions about whether the data being used is free from the biases and inequities so often present in our communities.
What exactly are algorithms? Dan Chenok, executive director of the IBM Center for the Business of Government, defines them as the complex equations built into a computer system’s software, “which enables interpretation of data in a much more powerful way than people could on their own.”
There have been discussions for some time about how to weigh the potential promise of algorithms against the risk of hidden built-in biases. But the debate came into sharp relief after Election Day, when Californians voted against Proposition 25, which would have replaced the longstanding practice of judges setting bail according to a general set of guidelines with a system of “pre-trial risk assessments.”
The notion was to use a variety of factors to create an algorithm that would help a judge determine whether a person should be released or kept in jail, based largely on the degree to which their release represented a threat to society. This would have eliminated cash bail entirely and saved accused individuals the money they typically pay bail bondsmen to provide the cash necessary for release.
“These particular algorithms would look at the same variables, the same information, for all people and weight those variables so they can be scored,” says Heather Harris, research fellow at the Public Policy Institute of California.
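To make that concrete, here is a minimal sketch in Python of the kind of weighted scoring Harris describes. The factor names and weights are hypothetical, invented for illustration; actual pretrial tools use validated variables and calibrated weights.

```python
# Hypothetical illustration of the kind of weighted scoring Harris
# describes: the same variables, weighted the same way, for everyone.
# The factors and weights below are invented for this sketch, not
# drawn from any actual risk-assessment tool.

RISK_WEIGHTS = {
    "prior_failures_to_appear": 2.5,
    "pending_charges": 2.0,
    "prior_violent_convictions": 3.0,
}

def risk_score(defendant: dict) -> float:
    """Apply the same variables and weights to every defendant,
    producing a single comparable score."""
    return sum(
        weight * defendant.get(factor, 0)
        for factor, weight in RISK_WEIGHTS.items()
    )

# The same formula scores every case identically:
print(risk_score({"prior_failures_to_appear": 1, "pending_charges": 2}))  # 6.5
```

The appeal of such a formula is consistency; the worry, as critics note below, is that both the chosen variables and the historical data behind them can encode bias.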
The bail industry fought back, launching a campaign against the measure. But critics also emerged from the civil rights community, raising concerns about algorithms used in areas with direct impact on human lives, such as bail, parole and sentencing. These concerns aren’t new. As far back as five years ago, ProPublica dug into a number of issues with algorithms and found, for example, that in Broward County, Florida, which was using them in sentencing, “the formula was particularly likely to falsely flag Black defendants as future criminals, wrongly labeling them this way at about twice the rate of white defendants.”
Josh Martin, the chief data officer in Indiana, says states need to consider these dangers when building and using algorithms. “There have been numerous case studies that highlight how using race in machine-learning algorithms can lead to biases in unrepresentative training data or the unconscious/conscious biases held by the people writing and tuning the algorithms,” Martin writes in an email.
Indiana has been particularly successful in using algorithms in education and workforce development. For example, they are used to identify the education pathways from high school graduation through higher education that lead to the highest wage outcomes for residents. This information helps students in the state understand college costs, how to borrow wisely, and what to expect from the choices they make, such as whether to pursue an associate’s, bachelor’s or more advanced degree and how their field of study will affect future earnings.
Based on our research, the key is to recognize the risks that algorithms present and to be sure that they are used in a transparent way. This means making public the variables used and the weightings applied to them.
This kind of transparency is particularly important when decisions are being made in areas like criminal justice. Things get much easier in fields like transportation. Atlanta, for example, saw a 25% reduction in the crash rate along a 2.3-mile stretch known as the North Avenue Corridor after it piloted a data application to analyze key risk factors. This allowed city engineers to predict likely crashes and take steps to avert them, such as adjusting the timing of traffic lights.
A couple of localities stand out for the great care they take in using algorithms: Cook County, Illinois, the second most populous county in the United States, and New York, the nation’s largest city.
Says Dessa Gypalo, Cook County’s chief data officer, “Data analytics has typically been opaque. It’s been a black box. But everything we do has to be done transparently and with caution.”
One example she cites was the use of algorithms to help parcel out the money from the federal government’s CARES Act, intended to provide economic assistance for workers, families, businesses, and local governments to help them get through the coronavirus-created economic downturn. This was a particularly tricky endeavor in Cook County, which includes the city of Chicago and more than 130 other municipalities.
The algorithm used five data points: the population of each municipality; median income; Covid-19 deaths per 100,000 people; tax base per capita; and the percentage of the population located in an economically disconnected or disinvested area. The last data point was developed by the county’s regional partner CMAP, the Chicago Metropolitan Agency for Planning, in an effort to quantify years of inequitable regional investment practices. Using these variables, the county was able to create a score for each municipality and then base funding allocations on those scores.
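To illustrate how such a score might translate into dollar amounts, here is a minimal Python sketch. The five data points come from the article; the min-max normalization, the equal weighting, and the direction of each factor (for instance, lower median income raising a municipality’s score) are assumptions made for this example, not Cook County’s published methodology.

```python
# A sketch of a scoring-based allocation like the one described above.
# The five factors come from the article; the equal weights, min-max
# normalization, and each factor's direction are assumptions.

# +1 means higher raw values raise the score; -1 means lower values do.
FACTOR_DIRECTIONS = {
    "population": +1,
    "median_income": -1,
    "covid_deaths_per_100k": +1,
    "tax_base_per_capita": -1,
    "pct_in_disinvested_area": +1,
}

def normalize(values: list[float]) -> list[float]:
    """Scale raw values to the 0-1 range (min-max normalization)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def allocate(municipalities: list[dict], total_funds: float) -> dict:
    """Score each municipality on the five factors, then split the
    funds in proportion to the scores."""
    scores = [0.0] * len(municipalities)
    for factor, direction in FACTOR_DIRECTIONS.items():
        normed = normalize([m[factor] for m in municipalities])
        for i, value in enumerate(normed):
            scores[i] += value if direction > 0 else 1.0 - value
    total = sum(scores)
    return {
        m["name"]: total_funds * s / total
        for m, s in zip(municipalities, scores)
    }
```

One virtue of this structure is that transparency is cheap: publishing the factor list, the direction of each factor, and the weights discloses everything the formula does.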
But here’s the important part: The county didn’t stop there. It ran the computerized results through a rigorous evaluation process that included many conversations with people who have longstanding institutional knowledge of the differences among Cook County’s municipalities. “We are so large, and we have a lot of different stakeholders,” says Gypalo, “and they were able to validate the findings.” The input from knowledgeable human beings led to a few tweaks, and Gypalo felt comfortable that the computerized recommendations weren’t just mathematically defensible, but fit with real-world considerations as understood by human beings.
In New York City, on November 19, 2019, Mayor Bill de Blasio signed an executive order establishing the position of Algorithms Management and Policy Officer (AMPO). As the mayor said that day, the position was established to “ensure the tools we use to make decisions are fair and transparent.”
Such was the importance of this new effort that the mayor appointed Jeff Thamkittikasem, his director of operations, to be acting AMPO for the city.
Thamkittikasem is abundantly aware of the risks of bias that algorithms can present. “Systemic racism exists,” he says, “and sometimes it can come into tools like algorithms, which are often based on previously gathered data . . . If you’re using bad data, you’ll get bad outcomes.”
The first step the city is taking is going back to algorithms that are already in use. By January, New York City will have the first published list of the systems that qualify for review. Then the agencies will carefully assess these tools to ascertain that “there isn’t any bias or disproportionality in how they were developed,” says Thamkittikasem.
One of Thamkittikasem’s goals is to empower the individual agencies to address their own use of the algorithms they’re putting in place by setting policies and practices that the agency leaders all understand. He’s savvy enough about the workings of government to know that different agencies will need to use different tools. “Human services are different than financial practices,” he says.
Will the kind of care New York City and Cook County are taking allow algorithms to be used in the complicated areas of criminal justice? That’s hard to say. One observer, arguing that algorithms can be just as fair as judges in making decisions, noted that “judges are also black boxes.”
Ultimately, wherever algorithms are used, Cook County’s Gypalo makes a strong point by quoting Uncle Ben from Marvel Comics’ Spider-Man, “With great power comes great responsibility.”
Katherine Barrett and Richard Greene of Barrett and Greene, Inc. are columnists and senior advisers to Route Fifty.