DARPA seeks AI assistants to help with complex tasks
The Defense Advanced Research Projects Agency wants to explore how artificial intelligence-enabled assistants can provide guidance -- in the form of just-in-time visual and audio feedback -- to help users expand their skillsets and minimize errors.
In the Perceptually-enabled Task Guidance (PTG) program, humans would wear microphones, sensors, head-mounted cameras and an augmented reality headset that all send and receive data. The system could help medics or mechanics, for example, by understanding what they’re working on and offering AR-based instructions to help them perform complex tasks.
“These sensor platforms generate tons of data around what the user is seeing and hearing, while AR headsets provide feedback mechanisms to display and share information or instructions,” said Bruce Draper, a program manager in DARPA’s Information Innovation Office. “Developing virtual assistants that can provide substantial aid to human users as they complete tasks will require advances across a number of machine learning and AI technology focus areas, including knowledge acquisition and reasoning.”
DARPA described three use cases. First: The human can initiate the interaction by asking, "What do I do next?" and then receive instructions. Second: If the human makes a mistake, the AI can issue a warning and suggest remedial action. Third: When a task is new, the AI can walk the human through the various steps of the task.
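The three interaction modes can be thought of as a simple guidance loop that tracks a user's progress through a known sequence of steps. The sketch below is purely illustrative -- the class and method names are assumptions for the sake of the example, not part of any DARPA specification:

```python
# Hypothetical sketch of the three PTG interaction modes.
# All names here are illustrative assumptions, not a DARPA design.

class TaskGuide:
    """Tracks a user's progress through a known sequence of task steps."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.position = 0  # index of the next step to perform

    def next_instruction(self):
        """Use case 1: the user asks 'What do I do next?'"""
        if self.position >= len(self.steps):
            return "Task complete."
        return f"Next: {self.steps[self.position]}"

    def observe(self, performed_step):
        """Use case 2: compare an observed action to the expected step
        and warn, with a remedial suggestion, on a mismatch."""
        expected = self.steps[self.position]
        if performed_step == expected:
            self.position += 1
            return f"OK: {performed_step}"
        return (f"Warning: expected '{expected}', saw '{performed_step}'. "
                f"Redo '{expected}'.")

    def walkthrough(self):
        """Use case 3: walk the user through every step of a new task."""
        return [f"Step {i + 1}: {s}" for i, s in enumerate(self.steps)]
```

For instance, `TaskGuide(["drain oil", "replace filter", "refill oil"])` would answer a "what next?" query with the first undone step, flag an out-of-order action as a mistake, or list all three steps for a first-time user.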
“The goal is to enable mechanics, medics and other specialists to perform tasks within and beyond their skillsets by providing just-in-time feedback and instruction for physical tasks,” DARPA wrote in the broad agency announcement. The technology would also help reduce errors.
The agency is looking to exploit advances in deep learning, automated reasoning and augmented reality through novel approaches and integrated technologies that address four key, interdependent problems:
- Knowledge transfer. The assistants must automatically acquire task knowledge from materials intended for humans, such as checklists, manuals and training videos.
- Perceptual grounding. Objects, settings, actions, sounds and words recognized by the assistant must align with the terms used to describe tasks so observations can be mapped to task knowledge.
- Perceptual attention. Assistants must pay attention to percepts that are relevant to the task, while ignoring extraneous stimuli. They also must respond to unexpected events that may alter the user’s goals or suggest a new task.
- User modeling. Assistants must determine how much information to present to the user and when to do so.
The agency is looking for proposals in two technical areas.
Area one is for fundamental research into the four key problems. Area two is to demonstrate technologies that address the four key problems in a militarily relevant scenario in one of three broad areas -- mechanical repair, battlefield medicine or pilot guidance.
DARPA expects to make six awards for area one worth a total of $30 million, followed by a pair of awards in area two worth $10 million.
A proposers’ day is scheduled for March 18, with abstracts due March 31 and final proposals due May 14.
This article was first posted to Washington Technology, a sibling site to GCN.