Conversational AI eases soldier-robot teaming
The Army Research Laboratory is working on bi-directional conversational technology that will make it easier for soldiers to team with autonomous systems.
To make it easier for warfighters to interact with robotic systems without checking a screen or using a joystick, the Army Research Laboratory (ARL) is working to enable bi-directional conversational artificial intelligence that will give soldiers command and control of autonomous systems and speed the performance of joint soldier-robot tasks.
Unlike consumer-grade conversational assistants that retrieve information from public sources, the Joint Understanding and Dialogue Interface is designed for reasoning tasks in the physical world, where data is sparse and there is little to no reliable cloud connectivity. And whereas Alexa and Siri require thousands of training examples, JUDI can learn a task from only hundreds of examples, roughly an order of magnitude fewer, according to an ARL report.
“Instead of relying on pre-specified, and possibly outdated, information about a mission, dialogue enables these systems to supplement their understanding of the world by conversing with human teammates,” ARL Research Scientist Matthew Marge said. In tactical operations, JUDI allows soldiers to issue verbal instructions to a mobile robot, which can, in turn, ask for clarification or provide updates on the task’s status.
JUDI’s dialogue processing interprets a soldier’s intent from spoken language. It is based on a statistical classification method trained on a small dataset of human-robot dialogue in which human experimenters stood in for the robot’s autonomy during the initial phases of the research, Marge said.
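ARL has not published the details of JUDI’s classifier, but the general technique the lab describes, mapping a transcribed command to an intent label with a statistical model trained on a few hundred labeled utterances, can be sketched roughly as follows. The intent labels and training pairs below are illustrative assumptions, not ARL’s data or code.

```python
# Minimal sketch (not ARL's implementation): statistical intent
# classification over transcribed commands, trained on a small,
# hand-labeled set of utterance/intent pairs.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative training pairs; real data would come from human-robot
# dialogue collected with experimenters standing in for the robot.
utterances = [
    "move to the door on your left",
    "scan the building ahead",
    "stop where you are",
    "send me a picture of what you see",
]
intents = ["navigate", "scout", "halt", "report"]

classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
classifier.fit(utterances, intents)

# A new spoken command, already transcribed, is mapped to an intent.
print(classifier.predict(["go to the left door"])[0])  # likely "navigate"
```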
When the technology is adapted to autonomous systems, robots can access multiple sources of context, including the speech of soldiers and their own perception systems, to help in collaborative decision-making.
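How an interpreted command might be checked against the robot’s own perception before acting, so the robot can ask for clarification rather than guess, is sketched below. The data structures and function names are hypothetical illustrations, not JUDI’s actual interfaces.

```python
# Illustrative sketch: combine a soldier's interpreted command with the
# robot's perception to decide between acting and asking for clarification.
from dataclasses import dataclass

@dataclass
class Percept:
    label: str          # e.g. "door", "vehicle"
    bearing_deg: float  # direction relative to the robot

def resolve(intent: str, target: str, percepts: list[Percept]) -> str:
    """Return an action, or a clarification request if the target is ambiguous."""
    matches = [p for p in percepts if p.label == target]
    if not matches:
        return f"ask: I don't see a {target}. Can you describe it?"
    if len(matches) > 1:
        return f"ask: I see {len(matches)} {target}s. Which one do you mean?"
    return f"execute: {intent} toward {target} at bearing {matches[0].bearing_deg:.0f} degrees"

# Example: the soldier says "move to the door," but the robot sees two doors.
percepts = [Percept("door", 30.0), Percept("door", 310.0)]
print(resolve("navigate", "door", percepts))
```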
“JUDI’s ability to leverage natural language will reduce the learning curve for soldiers who will need to control or team with robots, some of which may contribute different capabilities to a mission, like scouting or delivery of supplies,” Marge said.
JUDI will be integrated into the Combat Capabilities Development Command’s ARL autonomy stack, a suite of algorithms, libraries and software components that perform specific functions required for navigation, planning, perception, control and reasoning, ARL officials said. Successful innovations in the stack are also rolled into the CCDC Ground Vehicle System Center’s Robotics Technology Kernel, where they are put through extensive testing and hardening for use in robotic combat vehicle programs.
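One way to picture that integration is a dialogue component feeding interpreted commands into the stack’s planning layer while status flows back to the soldier. The module and method names below are hypothetical and stand in for the ARL autonomy stack and Robotics Technology Kernel interfaces, which are not described in the report.

```python
# Hypothetical sketch of a dialogue component sitting alongside other
# autonomy-stack functions; names are illustrative, not ARL's APIs.
class DialogueInterface:
    def interpret(self, transcript: str) -> dict:
        # In a real stack this would call the trained intent classifier.
        return {"intent": "navigate", "target": "door"}

class Planner:
    def plan(self, command: dict) -> list[str]:
        # Turn a structured command into low-level actions.
        return [f"drive_to({command['target']})"]

class AutonomyStack:
    """Dialogue feeds commands into planning; status flows back to the soldier."""
    def __init__(self) -> None:
        self.dialogue = DialogueInterface()
        self.planner = Planner()

    def handle_utterance(self, transcript: str) -> list[str]:
        command = self.dialogue.interpret(transcript)
        return self.planner.plan(command)

print(AutonomyStack().handle_utterance("move to the door"))
```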
Researchers will evaluate JUDI’s robustness on mobile robot platforms at a field test scheduled for September.
“Our ultimate goal is to enable soldiers to more easily team with autonomous systems so they can more effectively and safely complete missions, especially in scenarios like reconnaissance and search-and-rescue,” Marge said. “It will be extremely gratifying to know that soldiers can have more accessible interfaces to autonomous systems that can scale and easily adapt to mission contexts.”