Putting remote control of robots within everyone's reach
Researchers are developing software that works with off-the-shelf virtual reality headsets to put remote control of robots within the reach of any developer.
As drones have evolved, so have their controllers. Pilots can position their drones with keyboard interfaces, joysticks, videogame-style controllers, devices that attach to smartphones and tablets, and proprietary systems. But these devices use a two-dimensional view to manipulate objects in 3D and may not be sophisticated enough for complex tasks.
To address this need, researchers at Brown University are developing technology that puts remote control of robots within the reach of any developer with an off-the-shelf virtual reality headset, such as the HTC Vive or Oculus Rift.
The software, built with Unity, the game-development platform behind many of the most popular VR applications, connects a robot’s controllable features -- arms, grippers, cameras, sensors and so on -- to the controls of VR systems over the internet. That allows a user, for example, to move a robot’s arm or manipulate objects with the VR system’s handheld controller. And by connecting to the robot’s sensors, the operator can work in the remote environment while seeing what the robot sees.
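In ROS terms, that connection boils down to publishing the VR controller’s pose to a topic the robot listens on. The sketch below illustrates the pattern with the open-source roslibpy client talking to a rosbridge websocket server; the host, port, topic name and helper function are assumptions for illustration, not details of the Brown system.

```python
# Sketch: relay a VR hand-controller pose to a robot arm over the internet.
# Assumes a rosbridge websocket server is running on the robot side;
# the host, topic name and message flow are illustrative, not the Brown team's.
import roslibpy

# Connect to the robot's rosbridge server (hypothetical host and default port).
ros = roslibpy.Ros(host='robot.example.org', port=9090)
ros.run()

# Publisher for the arm's target pose (hypothetical topic name).
arm_goal = roslibpy.Topic(ros, '/arm/goal_pose', 'geometry_msgs/PoseStamped')

def send_controller_pose(position, orientation):
    """Forward a headset controller's position and quaternion as an arm goal."""
    arm_goal.publish(roslibpy.Message({
        'header': {'frame_id': 'base_link'},
        'pose': {
            'position': {'x': position[0], 'y': position[1], 'z': position[2]},
            'orientation': {'x': orientation[0], 'y': orientation[1],
                            'z': orientation[2], 'w': orientation[3]},
        },
    }))

# Example: a pose sampled from the VR runtime would be forwarded like this.
send_controller_pose((0.4, 0.0, 0.3), (0.0, 0.0, 0.0, 1.0))

ros.terminate()
```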
"We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn't be," David Whitney, a graduate student in Brown's computer science department, said. "Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station."
Whitney and his co-developer, undergraduate student Eric Rosen, both work in the university’s Humans to Robots Laboratory, which is focused on developing collaborative robots that use sensors to perceive the world, effectively communicate with people and help meet their needs.
One of the biggest challenges in such collaboration is keeping communications compact enough to travel over non-dedicated channels, such as the internet, without degrading performance. In testing so far -- admittedly with limited types of robotic operations -- the Brown team’s software has met that challenge.
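One standard ROS tactic for trimming bandwidth on a link like that is to subscribe to a JPEG-compressed camera topic rather than the raw image stream. The sketch below shows that pattern; whether the Brown software uses this particular mechanism is an assumption, and the topic name is illustrative.

```python
# Sketch: receive a robot's camera feed as JPEG-compressed frames,
# a common ROS convention for bandwidth-constrained links.
# The topic name is an assumption; real cameras publish under their own namespaces.
import time
import roslibpy

ros = roslibpy.Ros(host='robot.example.org', port=9090)
ros.run()  # connect; the event loop runs in a background thread

camera = roslibpy.Topic(ros, '/camera/image_raw/compressed',
                        'sensor_msgs/CompressedImage')

def on_frame(message):
    # Over rosbridge, 'data' arrives as a base64-encoded JPEG payload;
    # a real client would decode it and hand the frame to the VR renderer.
    print('frame received, format:', message['format'])

camera.subscribe(on_frame)

try:
    while True:
        time.sleep(1)  # keep the main thread alive while frames stream in
except KeyboardInterrupt:
    ros.terminate()
```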
In testing, 18 novice users were able to complete a cup-stacking task 66 percent faster when controlling a robot using the virtual reality system compared with using a traditional keyboard-and-monitor interface.
The key to their effort, Whitney said, is to develop a non-proprietary system that can be used with virtually any equipment.
“It connects over ROS, the robot operating system,” Whitney said. “If you go to any lab across the country, odds are that all of their robots run ROS.”
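That ubiquity comes from ROS’s uniform publish/subscribe interface: a minimal node looks the same no matter whose hardware sits underneath. Here is a sketch using the standard rospy client library, with an illustrative node and topic name.

```python
# Sketch: a minimal native ROS node using the standard rospy client library.
# The same publish/subscribe pattern works on any ROS-based robot;
# the node and topic names here are illustrative.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node('status_reporter')
    pub = rospy.Publisher('/robot/status', String, queue_size=10)
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data='ready'))
        rate.sleep()

if __name__ == '__main__':
    main()
```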
ROS, in fact, is a story that has flown relatively under the radar in most coverage of robotics’ rapid development.
It’s difficult even to pinpoint a birthdate for ROS, though some researchers point to the code and resources provided beginning in 2007 by Willow Garage, a robotics incubator near Stanford University, as the primary driving force.
That collaborative code development, and the unofficial standards that grew out of it, has led to rapid advances in capabilities, including those developed by the Brown team. And the Brown team wants to return the favor.
“We want our software to be freely and widely used,” Whitney said. The software is publicly available for download on GitHub.