Training responders to use augmented reality, exoskeletons
Researchers have developed a mixed-reality learning environment that uses exoskeletons and augmented reality to train emergency responders and improve their safety and performance by enhancing their physical and cognitive capabilities.
The Learning Environments with Augmentation and Robotics for Next-gen Emergency Responders (LEARNER) is a mixed-reality, cloud-based learning platform designed specifically to train emergency responders to use human augmentation technologies (HATs).
LEARNER was originally funded in 2019 by the National Science Foundation’s Convergence Accelerator, which supports multidisciplinary, team-based science addressing national-scale societal challenges. In August, NSF awarded follow-on funding to the project partners: the University of Florida, Virginia Tech, Texas A&M University, Knowledge Based Systems Inc. and Sarcos Robotics. The Public Safety Research Division UI/UX group at the National Institute of Standards and Technology will also work with the team.
HATs, such as exoskeletons and AR, can dramatically improve the safety and performance of responders. Powered exoskeletons can give rescue personnel the extra strength to lift heavy objects, minimizing their fatigue, reducing injuries and preserving their autonomy and decision-making abilities. With shared AR programs, command staff can guide responders wearing AR headsets, giving them real-time scene annotations on the location of victims, exits and potential hazards. Unfortunately, there are currently few training platforms that can integrate customizable solutions to teach responders how to use HATs, according to the project’s website.
The LEARNER project aims to develop and assess physical, augmented- and virtual-reality technology for responders and build a mixed-reality learning platform with AR- and exoskeleton-specific learning modules that are optimized and continuously adapted for individual responders and their environments. It uses artificial intelligence to analyze biometric and behavioral data to personalize training to a responder’s learning needs.
Ultimately, researchers want to create an open-source knowledge-sharing platform that will speed the integration of HAT training not just for responders, but also for workers in construction, energy, manufacturing and health care.
“LEARNER is a personalized learning platform that will incorporate physiological, neurological and behavioral markers of learning into real-time emergency response scenario evolution,” said Texas A&M’s Ranjana Mehta, principal investigator on the project and director of the NeuroErgonomics Lab. The training will be accessible via laptops, AR headsets and haptic suits at field houses and in-situ emergency response training centers, she said.
“Imagine if health care workers are quickly able to learn how to use powered exoskeletons using LEARNER -- fewer workers would be needed for safer patient handling, thereby potentially reducing the spread of COVID-19-related infections,” Mehta said. “The award will accelerate our efforts to make immediate impacts to address challenges of national importance such as this.”