Time for an Update: How States Can Improve IT Systems to Use Data More Effectively
COMMENTARY | State governments have data, but aren’t using it well. A strategic approach is needed to rebuild their data infrastructure.
Earlier this year, when governors delivered their state of the state addresses, they barely mentioned data. As we approach 2021, we can expect a very different story. Whether it’s a Covid-19 update, economic outlook or school reopening plan, data is now a regular part of governors’ messaging.
A major sign of the importance of data is the growing number of dedicated staff in charge of governors’ data efforts. In the last year, the number of states with chief data officer positions has grown from 20 to 29. But even with this increased leadership, bringing government data together to improve services still poses a significant challenge, despite the clear benefits for both pandemic response and everyday life. Why?
In short, government data technology systems aren’t sufficient to handle the new demands imposed by the pandemic. As Rhode Island found during the initial stages of the pandemic, its decades-old mainframe system simply couldn’t handle the surge of new unemployment applicants. The state isn’t alone: a 2020 survey from the National Association of State Chief Information Officers identified improving digital services and modernizing legacy systems as top priorities.
Current government data infrastructure isn’t up to the task for several reasons: systems are old, application capacity is limited, reporting timelines are too slow, data formats are inconsistent and multiple databases capture similar information. For example, across the nation, unemployment and motor vehicle databases are decades old, resulting in slow application and processing times for even the simplest tasks. Similarly, state child welfare data systems are built using outdated practices. The problem with these antiquated systems is that they can’t accommodate the increased need for data to inform policymaking and disaster response. As data needs have changed over time, states haven’t made the technology investments required to meet them. The result is that state agencies are forced to find Band-Aid solutions or workarounds to accommodate new requirements, and the data in these systems suffers.
Inadequate data systems also mean that the timelines for getting updated information are a relic of another era. In the past, a quarterly report on hospital ICU capacity might have been sufficient. But monthly, quarterly or even annual reporting intervals don’t allow for the kind of rapid decision-making that a public health emergency requires. In 2020, decision-makers want information daily, or even hourly. Even where states have modernized these legacy systems, the replacements are often managed by third-party vendors and generally aren’t designed to enable rapid access to the data within them. States must rely on those vendors to extract data, often resulting in delays or additional costs.
These challenges are significant but not insurmountable. Several states are already working to change the status quo. For example, Colorado’s chief data officer has been leading an effort called the Joint Agency Interoperability project, which aims to stitch together both legacy and modern data systems. The plan is to create application programming interfaces that enable frontline workers to understand how individuals are interacting with multiple services, and that allow this data to be aggregated and analyzed for population-level trends. Using this approach, states can prioritize data access from the outset and improve their ability to use data to inform decisions. Connecticut has taken a more prospective approach by requiring that technology investments are consistent with the state data plan, ensuring that data access and sharing are built into any future IT procurements.
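To make the interoperability idea concrete, here is a minimal sketch of the two capabilities described above: a combined per-person view for frontline workers, and a population-level aggregate across systems. The data, field names and function names are all hypothetical illustrations, not Colorado’s actual design.

```python
# Hypothetical extracts from two siloed legacy systems, keyed by a shared
# person identifier. Real systems would expose these via APIs, not dicts.
unemployment_records = {
    "p1": {"claim_status": "active"},
    "p2": {"claim_status": "closed"},
}
child_welfare_records = {
    "p1": {"case_open": True},
    "p3": {"case_open": False},
}

def person_view(person_id):
    """Combined view of one person across both systems (frontline use).

    Returns None for a system the person does not appear in.
    """
    return {
        "unemployment": unemployment_records.get(person_id),
        "child_welfare": child_welfare_records.get(person_id),
    }

def service_overlap_count():
    """Population-level aggregate: how many people appear in both systems."""
    return len(set(unemployment_records) & set(child_welfare_records))
```

For example, `person_view("p1")` shows one person’s records from both systems side by side, while `service_overlap_count()` returns 1, since only one person here interacts with both services; the same pattern scales to trend analysis once the underlying systems expose their data through APIs.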
Colorado and New Jersey have also created digital service teams modeled after the U.S. Digital Service and 18F, which have played an important role in modernizing federal systems. These teams take modern approaches to developing technology, supporting the efforts of chief data officers to make systems more nimble. This dedicated human capital can also help states adapt their data systems more quickly in moments of crisis.
None of these solutions delivers results overnight, but they are part of an iterative process of improvement in which successes build on one another. Designing, and appropriately staffing, a long-term strategy to connect and modernize data systems is a smart first step for every state government. By adopting this strategic approach, states can break down silos, increase data use and better serve the public.
Tyler Kleykamp is the director of the State Chief Data Officers Network. Jed Herrmann is vice president for state and federal policy implementation at Results for America.