Democratizing AI for agencies
With Microsoft Cognitive Services, agencies can beef up their applications with emotion and video detection; facial, speech and vision recognition; and speech and language understanding.
Agencies that want to incorporate artificial intelligence into their apps can take advantage of the five different categories of Cognitive Services now available on Microsoft’s Azure Government Cloud.
With the application programming interfaces and software development kits in Microsoft Cognitive Services, agencies can add emotion and video detection; facial, speech and vision recognition; and speech and language understanding to applications.
To help agencies serve constituents with hearing disabilities, for example, an API can convert speech to text, help identify speakers and understand the intent in the tone of their voices. Language apps can provide multilingual services that help governments respond interactively to citizen requests, and knowledge and search apps can be used by bots or virtual assistants to access data from the web to answer questions.
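As a rough illustration of how such a capability might be wired into an application, the sketch below sends a short audio clip to a Cognitive Services speech-to-text REST endpoint using Python's requests library. The endpoint URL, region and subscription key are placeholders for illustration, not details taken from the article.

```python
import requests

# Placeholder values -- an agency would substitute its own subscription key and Azure region.
SUBSCRIPTION_KEY = "YOUR_COGNITIVE_SERVICES_KEY"
STT_ENDPOINT = (
    "https://westus.stt.speech.microsoft.com/"
    "speech/recognition/conversation/cognitiveservices/v1"
)

def transcribe(audio_path: str, language: str = "en-US") -> str:
    """Send a short WAV clip to the speech-to-text service and return the transcript."""
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "audio/wav; codecs=audio/pcm; samplerate=16000",
    }
    with open(audio_path, "rb") as audio:
        response = requests.post(
            STT_ENDPOINT,
            params={"language": language, "format": "simple"},
            headers=headers,
            data=audio,
        )
    response.raise_for_status()
    # With the "simple" format, the recognized text comes back in DisplayText.
    return response.json().get("DisplayText", "")

if __name__ == "__main__":
    print(transcribe("constituent_request.wav"))
```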
Chatbots or virtual assistants have piqued the interest of agencies that want to use them to respond to constituent requests 24 hours a day, lessening the burden on call centers that answer many of the same questions.
“Through the bot framework, people can type something into a box, and behind the scenes it is doing a lookup and correctly respond[ing] in the way that we told it to,” Ken Hausman, a data platform solution architect at Microsoft, said at the Azure Government DC user community meetup on March 29.
“It is generally going to respond back using a service called LUIS [Language Understanding Intelligent Service] under the hood,” he said.
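For context, the kind of lookup Hausman describes amounts to a single REST call that returns the top-scoring intent for whatever the user typed. The sketch below, again in Python with the requests library, assumes a hypothetical LUIS app ID, key and v2.0 prediction endpoint; the bot logic that routes on the result is only hinted at.

```python
import requests

# Placeholder values -- a real bot would use its own LUIS app ID, key and region.
LUIS_APP_ID = "00000000-0000-0000-0000-000000000000"
LUIS_KEY = "YOUR_LUIS_KEY"
LUIS_ENDPOINT = f"https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/{LUIS_APP_ID}"

def top_intent(utterance: str) -> str:
    """Ask LUIS which intent best matches what the user typed into the chat box."""
    response = requests.get(
        LUIS_ENDPOINT,
        params={"q": utterance, "subscription-key": LUIS_KEY},
    )
    response.raise_for_status()
    return response.json()["topScoringIntent"]["intent"]

if __name__ == "__main__":
    # The bot framework would route the conversation based on the returned intent,
    # replying in whatever way the agency has scripted for that request.
    print(top_intent("How do I renew my driver's license?"))
```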
To bring some discipline to the introduction of these new technologies, the General Services Administration is conducting its first pilot program using virtual assistants produced by Microsoft, Amazon and Google to respond to citizen requests.
The goal of the pilot is to see how citizens are accessing public services in ways that are not completely dependent on the websites themselves, Justin Herman, who leads the agency’s Emerging Citizen Technology Program Office, said at the same event.
Because AI bot technology is changing rapidly, each of the pilot programs will be implemented quickly and will last only a month. This rapid, agile development will give agencies and the public an opportunity to view the results and build on the findings.
“We don’t want the paradox of the present haunting us where people are just bound by what they see now,” Herman said. “We need to look three steps ahead.”
Herman said he also sees possibilities in programs like Azure’s vision and speech apps to make government services more “actionable and digestible,” not only for citizens but also for the developers and program managers on the backend.
Cognitive Services also give agencies a way to adapt these capabilities to their own applications, according to Steve Michelotti, senior program manager with Azure Government.
“We are taking the complex stuff from machine-learning models so you can put it into APIs and you can consume the services for yourself and your own applications,” Hausman said. “We want to democratize AI services.”