Data-Based Decision-Making Is Flawed When the Data Is Flawed
There are many reasons the quality of state and local data can be poor. Using that information can lead to unfortunate results.
There’s nothing new about the importance of data to the smooth functioning of state and local government. But over the last few years, with the aid of advancing technology, phrases like “data-driven” have become ubiquitous and are used to make it appear that policies that fit in that category have some kind of magical seal of reliability.
In fact, "while it's a positive development that agencies are relying more on data," says Tracey Smith, associate director of Virginia's Joint Legislative Audit and Review Commission (JLARC), "that doesn't mean the data is actually good."
There may even be an inverse correlation between increased use of data and the likelihood that it is accurate, timely and useful.
“The more you rely on data, the more incentive there is for people to be careless with it, because they’re in a hurry,” says Mark Funkhouser, the former mayor of Kansas City and president of Funkhouser and Associates, a consulting firm that helps small and mid-size cities do strategic planning. “The boss wants it this afternoon, and the time pressure for government employees is great because they’re short-staffed.”
Whatever the reasons for faulty data, our regular review of performance audits in state and local governments reveals a staggering number in which poor data quality is partially to blame for a program’s problems.
A December 2022 JLARC study, for example, found that because of data weaknesses, the state’s psychiatric hospital bed registry “lacks real-time, useful information about the psychiatric beds available. Ninety-two percent of surveyed CSB [Community Services Board] staff with bed search responsibilities indicated that the bed registry was either not at all useful or not being used as part of their bed search process.”
This is important because the registry is intended to help people in dire need of treatment in psychiatric units receive that treatment quickly. The state limits how long people can be held to await treatment without their consent, and because of incomplete and unreliable data on psychiatric hospital bed availability, some individuals may be released before it’s safe for them or for others.
Then there was a June 2021 report out of Berkeley, California, which found that data problems were one significant reason the city’s fleet replacement fund was short millions of dollars. Jenny Wong, the city’s auditor, found that the fund did not hold enough money to replace vehicles and equipment already overdue for replacement. If the fund “is not sufficient to replace vehicles and equipment on time, it can cost the city more in the long run due to the excess maintenance and repair costs to keep an aging fleet running,” she wrote.
The fund was short in large part because, “though the Department of Public Works was responsible for keeping track of accurate replacement dates for vehicles, it wasn’t doing so,” says Wong. “As a result, if a vehicle broke down, there was no assurance that there would be enough money in the fund to purchase a new one, and then you might have to go to the city council to get the necessary funding.”
“What’s more, some vehicles were replaced using money from the fleet fund. But the old vehicles weren’t necessarily retired, and the data didn’t reflect that, which resulted in ‘fleet creep,’” according to Wong.
Similarly, a December 2022 Vermont audit found data that wasn’t being recorded. The audit reviewed the Department of Corrections’ process for receiving and responding to prisoner grievances. Overall, it stated, “the record-keeping system that DOC uses to collect information on grievances—the Offender Management System (OMS)—does not have reliable, basic information to determine the number, type, status, or outcome of prisoner grievances.”
In other words, the data in OMS is inaccurate or otherwise unusable. It does not contain the dates that complaints are submitted or when staff respond to the prisoner or take action to resolve issues, “so it is not possible to determine the extent that the department is meeting its own timeframes for responding to grievances,” says the state’s auditor, Doug Hoffer.
This is one of many instances in which the absence of quality data isn’t just a problem with a bunch of numbers; the security and safety of real human beings is at stake, because grievances that need addressing can easily be ignored. That, in turn, has further ramifications. “To the extent that a prisoner believes the serious complaints are taken seriously, there’s likely going to be less tension,” says Hoffer. “But the administration has not yet taken this seriously, or we would not have found what we found.”
Legacy systems can further complicate how data is used. As Bill Kehoe, chief information officer of Washington state, explains, “Even accurate data, when it’s old, can be easily misunderstood [and] result in inaccurate reporting.” That’s because data dictionaries, which clarify the meaning of each piece of data, are often never created, and when the people who created the data or truly understand it retire, “it can become increasingly difficult to make sure it’s not being misused.”
Even when the software is fully up to date and staff is available to ensure its accuracy, there are instances in which the data provided for decision-makers is misrepresented in order to make a case for a particular decision.
Consider economic development. In Vermont, the flagship business incentive program is based on a “but for” test, which is to say that, but for the incentive, the new jobs or economic benefits would not have emerged.
But as Hoffer points out, even though companies can make whatever claims they like about the importance of the economic development dollars they’ve received, “A major study shows that at least 75 percent of all the businesses would have made the same decisions, without the economic development dollars.”
Says Marcus Stubblefield, criminal justice and policy strategy section manager for King County, Washington, “Data is relative. I can get data that proves anything I want to say. That doesn’t mean it’s good data.”
Data can also be manipulated to give false testimony. How data is presented in visual form, for instance, can be misleading. In a laudable effort to simplify access to data, visualization has become an increasingly common buzzword. But charts and graphs can give the veneer of accuracy and simplicity, yet still be entirely misleading.
One example from Doug Jones, the Kansas City, Missouri, auditor: when the scale of a chart changes, the same data can communicate entirely different messages. A chart with a scale that runs from 1 to 100, for example, may make a 3-5% change seem tiny. But if the chart only goes up to 30, that same 3-5% change can look extremely dramatic.
“Sometimes people may change the scale of a graph because they’re trying to show something specific,” he says, “so the main thing is that when the top or the bottom of a chart is changed, people need to explain why they’ve done it.”