Government employees need hands-on, standardized AI training
Having a properly defined list of terms helps get everybody on the same page, as does baseline training for every employee.
Hiring a chief artificial intelligence officer may seem like a top priority for state and local leaders as they look to implement the technology across government, but training everyday employees on AI is perhaps a more pressing concern.
Already, states and cities are experimenting with the technology, deploying generative AI to automate repetitive tasks, to power call centers and 311 lines, to reroute buses to where demand is greatest, to solve murders and other violent crimes, and to speed up the processing of housing vouchers, among other uses.
AI has enormous potential to make government more efficient and revamp service delivery. It could have a big impact on employees’ productivity, too. A recent report from Deloitte estimated that generative AI could help boost productivity tenfold, provided it is paired with shifts in digital infrastructure, breaking down silos and better data sharing. The report describes those technological, policy and process improvements as a “synchronous dance.”
But without a trained workforce, many of the state and local experiments and efficiency gains risk falling flat. Public sector employees not only need reskilling to deal with the massive amounts of data generative AI relies on, but also training to use the technology properly and ethically.
To get this training in place, though, everyone needs to be speaking the same language. As it is, states and the federal government, which has its own guidance on how to apply AI and develop an AI-ready workforce, each have their own terminology and definitions around the technology.
The desire for standardized, shared terminology and definitions was one of the major findings from a series of listening sessions hosted by InnovateUS, an initiative of the Burnes Center for Social Change and GovLab that trains public sector employees on digital tools. The goal of the sessions was “to gather input on the development of a new philanthropically-funded, national online training program on responsible use of artificial intelligence (AI) for public sector workers.”
InnovateUS talked with more than 100 AI leaders over several months and found that a majority wanted AI trainings to consist of a “basic overview” of AI and a “plain language explanation” of how it works, including definitions and terminology.
Multiple people also called for the trainings to be hands-on, rather than having employees sit through a series of presentations. InnovateUS said it heard interest in a “modular, learn-at-your-own-pace asynchronous online course,” with hands-on exercises interwoven so people can experiment with the generative AI tools available online.
Koma Gandy, vice president of leadership and business at Skillsoft, said training should be “multimodal,” consisting of videos and other educational materials, as well as a “sandbox” to allow people to experiment with AI in a controlled environment.
Practical training keeps employees more engaged than staid, traditional instruction, Gandy added.
“It's important that it's not just a bunch of PowerPoint slides with a bunch of words on them,” she said. “It's, ‘Oh, I actually know what prompt engineering is because I tried it out, and then I solved what the output was, and I can figure out how I can use this in my job role.’ This is something that feels safe, and it feels like I understand how to use it, and I'm not afraid of it anymore.”
Within that sandbox, experts agreed, it also helps to have employees train on use cases relevant to their jobs. Having, for example, workers who administer public benefits programs experiment with what AI can do in that area, in a controlled, non-public-facing environment, would help show what is possible.
“Make it real, you let them test it out, you put them into a generative AI sandbox, and you let them play around with it and look at what they can do with it,” said Bill Eggers, the executive director of Deloitte’s Center for Government Insights. “This is what high school students, junior high school students and college students are doing every day now. If you look at that generation, they are the least afraid of the impacts of generative AI because they're using it all the time and they think it's going to enhance their life and enhance their career.”
“I think the practical experience will make the outcomes [of training] richer … even if you have to do it in a sandbox of sorts,” one industry expert told InnovateUS. “Fingers on keyboard—I think it is the difference between success and not being successful. It is basically essential that they actually put fingers to keys.”
Cristin Dorgelo, a visiting fellow at InnovateUS who also helped coordinate its listening sessions, said she heard that every government employee should receive “baseline” training in AI. Gandy, meanwhile, said tailoring even that baseline training will be key, as it should meet employees “where they are.”
InnovateUS found interest in training that prioritizes data management, understanding the types of use cases available, how to mitigate risk and bias, and how agencies can improve their organizational readiness and change management.
Users also must learn the cybersecurity implications of the AI technology they are being trained on. California Chief Information Security Officer Vitaliy Panych said during the recent Billington State and Local Cybersecurity Summit in Washington, D.C., that developing such a curriculum means people will be “risk-minded and risk aware” even when they casually use AI tools.
Training on AI may intimidate less technologically savvy employees, and it may give leaders pause, since helping those workers could require additional resources. But there are creative alternatives, including what Eggers described as “reverse mentoring,” in which younger or more confident employees help their colleagues get up to speed.
As well as providing educational opportunities for older employees, Eggers said it is a way of “bringing younger people up” and giving them “a lot of responsibility.”
While it may be tempting, recruiting new people into government roles cannot be the only solution, Gandy said. It must be balanced with making sure existing employees are well-trained.
“There's no hidden fountain of AI talent that's just going to emerge, it really is about investing in your people,” she said. “Part of investing in your people is investing in a strategy to make sure that your people have the opportunity to avail themselves of education where and how they need it, and relevant education to what their roles are and how they would expect to make changes in their workforce.”
Chief AI officers and other agency leaders will come in handy, Gandy said, as having their buy-in on using AI tools more frequently will encourage regular workers to use them, too. But those leaders must accept they “don’t know everything” about AI, show their “curiosity and interest” in the technology and identify pilot use cases.
“Once you've got that use case,” Gandy said, “that will whet people's appetites and also expose them in a controlled way to how artificial intelligence could be expanded and deployed.”