Machines learning to work with humans … and each other
Recent research shows machines are getting smarter and more agile, giving them the potential to replace humans in a wide array of jobs.
GCN’s editorial focus doesn’t include the policy implications of technology. If it did, I’d be writing today about how the growth of robotics and machine intelligence presents equally incredible opportunities and dangers for the workforce – and hence, for the economy – of this country.
After all, machines are increasingly relieving us humans of the need to do some very repetitive and, in some cases, dangerous jobs. Military drones, for example, keep human pilots out of harm’s way, just as industrial robots do work that is incredibly tedious and sometimes dangerous. As machines are made smarter and more agile, however, they have the potential to replace humans in a wide array of other jobs. And the growing use of machines is likely to present challenges to workers in many sectors.
While I won’t jump into the policy side of the issue – other than to note that the stakes are momentous – I want to highlight some of the more portentous research of the past year. The bottom line: machine capabilities are advancing much more rapidly than I expected.
Teaching robots to work with humans
The U.S. Navy’s Office of Naval Research has been experimenting with deploying a humanoid Shipboard Autonomous Firefighting Robot (SAFFiR) to fight fires on naval vessels, though results of that testing are not yet available.
The two-legged, 6-foot-tall robot is being challenged to maneuver through a decommissioned destroyer to locate a fire using infrared and laser sensors, open water valves and then drag a fire hose to douse the flames. SAFFiR is also being tested on its ability to search for victims.
So far, this is pretty much standard robot work. What makes the Navy’s current work with SAFFiR really interesting is that researchers are developing machine intelligence that will allow the robot to communicate with its human counterparts in natural language in real-world situations.
"We're really working toward having a robot that can work closely with people," Thomas McKenna, program officer in the Office of Naval Research's Warfighter Performance Department in the Human-Robot Interaction Division, told a Computerworld reporter. "That's the way firefighting teams work. Typically, there's a nozzle man in front, which the robot will be here. We want to enable that same interaction. We want the robot to operate like it's a sailor."
Teaching robots to work with each other
Along those lines, researchers at MIT are developing software to enable multiple robots to coordinate actions with each other.
The researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) are testing the software using iRobot Creates, which have the same chassis as the Roomba vacuum cleaner, to retrieve objects from unspecified locations in a warehouse and collaborate when they need to move heavy objects.
"In [multiagent] systems, in general, in the real world, it's very hard for them to communicate effectively," Christopher Amato, a postdoctoral researcher at CSAIL, told a reporter. "If you have a camera, it's impossible for the camera to be constantly streaming all of its information to all the other cameras. Similarly, robots are on networks that are imperfect, so it takes some amount of time to get messages to other robots, and maybe they can't communicate in certain situations around obstacles."
The as-yet-unnamed program that Amato and his colleagues are developing tracks each robot’s actions and location and measures the time and energy expended to complete tasks. It then processes that data automatically to learn the most efficient ways for the robots to accomplish their assigned tasks jointly.
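The researchers haven’t published the program’s internals, but the basic idea – log how much time and energy each robot spends on each kind of task, then use those observations to make better joint assignments – can be sketched roughly as follows. The class and method names below are hypothetical illustrations, not the CSAIL code.

```python
# A minimal sketch (hypothetical, not the CSAIL software): a coordinator that
# records the time and energy each robot spends on each task type, then
# assigns new tasks to whichever robot has the lowest observed average cost.
from collections import defaultdict


class TaskCoordinator:
    def __init__(self):
        # (robot_id, task_type) -> list of observed (seconds, joules) samples
        self.history = defaultdict(list)

    def record(self, robot_id, task_type, seconds, joules):
        """Log the time and energy a robot spent completing a task."""
        self.history[(robot_id, task_type)].append((seconds, joules))

    def average_cost(self, robot_id, task_type):
        """Average combined time-plus-energy cost observed for this pairing."""
        samples = self.history[(robot_id, task_type)]
        if not samples:
            return float("inf")  # no data yet: treat as worst case
        return sum(s + j for s, j in samples) / len(samples)

    def assign(self, task_type, robots):
        """Pick the robot with the lowest average observed cost for this task."""
        return min(robots, key=lambda r: self.average_cost(r, task_type))


if __name__ == "__main__":
    coord = TaskCoordinator()
    coord.record("robot_a", "fetch", seconds=42, joules=120)
    coord.record("robot_b", "fetch", seconds=30, joules=90)
    print(coord.assign("fetch", ["robot_a", "robot_b"]))  # -> robot_b
```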
Teaching robots to teach each other
One of the coolest steps toward a Skynet (of Terminator fame) is Robo Brain. A joint project by researchers at Cornell, Stanford, Brown and U.C. Berkeley, Robo Brain is a cloud-based repository of data that any robot with Internet access can draw upon to learn how to perform tasks better.
"I'm really looking forward to building this brain, with all this information that robots need," Ashutosh Saxena, an assistant professor in the computer science department at Cornell University, recently told a Computerworld reporter. "Instead of teaching each piece of knowledge to each robot, when a robot goes out in the real world, it can query the brain and learn how to do things."
Robo Brain, which is stored on the Amazon Web Services cloud, already holds approximately 10 terabytes of data describing the characteristics of objects, from tables to cars, and the steps involved in performing tasks, such as opening a door.
“Our laptops and cell phones have access to all the information we want," Saxena said. "If a robot encounters a situation it hasn't seen before, it can query Robo Brain in the cloud."
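The query pattern Saxena describes – check local knowledge first, ask the shared cloud repository only when something unfamiliar turns up – might look something like the sketch below. The endpoint, response format and object attributes are invented for illustration; they are not the actual Robo Brain interface.

```python
# Hypothetical illustration of a robot falling back to a shared cloud
# knowledge base when it lacks local knowledge about an object. The URL and
# JSON fields are assumptions for the sketch, not the real Robo Brain API.
import json
import urllib.request

# What this robot already knows from prior experience.
LOCAL_KNOWLEDGE = {"table": {"graspable": False, "typical_height_m": 0.75}}


def lookup(object_name, cloud_url="https://example.org/robobrain/query"):
    """Return what the robot knows about an object, asking the cloud if needed."""
    if object_name in LOCAL_KNOWLEDGE:
        return LOCAL_KNOWLEDGE[object_name]
    # Unfamiliar object: query the shared repository and cache the answer.
    with urllib.request.urlopen(f"{cloud_url}?object={object_name}") as resp:
        info = json.loads(resp.read())
    LOCAL_KNOWLEDGE[object_name] = info
    return info
```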
Currently, only the four universities that are collaborating on Robo Brain have access to the repository, but the team expects to expand access to others soon.
How long is your job safe?
And yes, machines are increasingly being used to write articles. They actually do a pretty good job when applied to such topics as synopses of sporting events and stories about companies’ financial reports. I’ll be curious to see how long it takes for a machine to win a Pulitzer Prize.