‘Early days’ for state-level response to Biden’s AI executive order
Experts urged state and local governments to be patient as the order plays out at the federal level, pointing to an executive order from a decade ago that offers a cautionary tale.
President Joe Biden’s executive order and guidance on the federal government’s safe use of artificial intelligence is a good starting point for state and local governments, but observers urged caution, given the uncertainty around the tech and its impacts on government operations.
At the very least, don’t rush out and hire a chief AI officer, said Abhi Nemani, senior vice president of product strategy at software company Euna Solutions.
“Your first impulse is going to be to hire a chief AI officer,” Nemani said of some state governments. “Please push back against that impulse. Hold strong on waiting to see how the policy landscape emerges, and where the technology is going.”
Biden’s sweeping executive order last week, and the subsequent guidance to federal agencies from the Office of Management and Budget, look to bring the “power of the federal government to bear,” a senior administration official said, on managing the risks and benefits associated with AI. The order also puts the onus on individual federal agencies to hire chief AI officers and adopt risk management policies.
Some observers are already optimistic about the potential impacts for other levels of government. In an email, Alexandra Reeve Givens, CEO of the digital rights and liberties nonprofit Center for Democracy & Technology, said the OMB guidance to federal agencies “sets a model for state and local governments to follow.”
But Nemani said the order “didn't really speak to state and local government,” although it alluded to some of the services delivered at that level, like benefits and human services. Instead, he advised states and localities to take a wait-and-see approach.
“I think it's early days yet to see both what tools take prevalence, what tools are useful and how they are actually used,” he said.
Nemani drew a comparison with an executive order issued by the Obama administration in 2013 that committed the federal government to various open data initiatives in a bid to make the government more efficient and transparent. That order prompted a stampede among state and local governments to replicate it, something Nemani said he witnessed firsthand as he helped launch the nonprofit Code for America.
But in their rush to embrace open data, many states and localities did not have the staff or the expertise to take advantage of the new technology, and so several efforts fell flat. It has taken years—and a pandemic—for some states to start to truly realize open data’s potential, with more work still ahead. And just as state and local agencies struggled to hire data analysts and scientists a decade ago, Nemani thinks it could be challenging to hire for AI roles as well.
“At the local level, it's harder because cities can't hire bodies for every department,” he said. “Are we going to hire AI chiefs for every department in every city that report to a chief AI officer across the whole board? That's going to be hard to do.”
States are already well underway with their own AI initiatives, work that has accelerated in the last year as IT and elected leaders alike look to better understand the technology. Indeed, in a recent survey by the National Association of State Chief Information Officers, 53% of respondents said generative AI will be the most impactful emerging IT area in the next three to five years, with another 20% citing AI and machine learning more broadly.
Alex Whitaker, NASCIO’s director of government affairs, said in an email that the federal government could “look to the many actions states have already taken to formulate effective and sensible AI policy,” especially those that have taken a proactive approach. The National Conference of State Legislatures found that AI-related bills were introduced in at least 25 states and territories this year.
Localities have also been active in experimenting with the emerging tech and in crafting policies and guidance. The National Association of Counties’ Artificial Intelligence Exploratory Committee is set to release an interim report on AI’s challenges and opportunities for county governments in February, as well as a survey to examine public sentiment and the perspectives of leaders.
In a statement released by NACo, Travis County, Texas, Judge Andy Brown said the group welcomes “the ongoing dialogue with our intergovernmental partners to build an approach that keeps our global competitive edge with AI and balances all the complex privacy, workforce and security issues.”
In the immediate aftermath of Biden’s order, Nemani called on state and local governments to “ask hard questions when looking at purchasing software,” including about its ethical considerations, data use, privacy measures and potential for misuse. “These are questions we have to ask of any piece of software,” he said. “With AI, it's even more urgent that we do so.”
Nemani suggested that state and municipal elected officials be proactive and engage with the vendor community and developers by establishing an advisory council, task force or similar group to understand the landscape and how it can impact their constituents.
“Go to the people who are actually doing the work today, and ask them to think about how AI and those tools can help them and what people or resources or training they need to be more effective,” Nemani said.
Matthew Thompson, senior vice president and general manager of public sector at identity management company Socure, acknowledged it is a “utopian view,” but said governments must generally work harder to stay ahead of new technologies, both to avoid being left behind and to ensure policy does not stand in the way of progress. “The speed of government needs to adapt,” Thompson said in an interview at NASCIO’s annual meeting last month.