Are we thinking about artificial intelligence all wrong?
Computer scientist Jerry Kaplan talks about automation and its implications for many government functions.
The robots are coming. They don’t want our jobs exactly, but work as we know it will be transformed nonetheless.
So argues computer scientist Jerry Kaplan, who’s been immersed in the intersection of technology, artificial intelligence and traditionally human tasks for nearly 40 years. The author of “Humans Need Not Apply: A Guide to Wealth and Work in the Age of Artificial Intelligence,” Kaplan contends that huge changes are indeed underway -- they’re just not the ones that are often discussed.
First of all, “we don’t automate jobs -- we automate tasks,” Kaplan said during a mid-January discussion at Mitre organized by Hooks Book Events. And that process, he argued, has been underway since at least the Industrial Revolution.
According to Kaplan, the problem is that too much attention is being paid to the idea of ever-smarter machines and software -- that sentient computer systems are evolving rapidly enough to replace humans in all sorts of roles. The term “artificial intelligence,” he said, distorts the public discourse.
“Machines don’t think,” Kaplan declared -- certainly not in the way that humans do -- and “it’s really little more than an analogy to say they think at all.”
And although there have certainly been advances in machine learning, he added, “there’s very little evidence that machines are on the path to becoming thinking, sentient beings.” To argue that current AI shows we’re headed in that direction “is like climbing a tree and claiming progress in getting to the moon.”
“The real issue is that, when it comes to work, human intelligence is somewhat overrated,” Kaplan said. And that reality has huge implications that are being overshadowed by the popular discourse about AI.
“If you have enough data, you can solve tasks that used to require intelligence,” he said. And just because machines can perform tasks that humans use intelligence for, that doesn’t mean machines have to think in order to do so.
Kaplan cited translation and text analysis as two classic examples. “Machine translation bears almost no resemblance to the human process,” he said, but the results can be remarkably accurate. And in jobs ranging from truck driver to surgeon, he argued, much of what humans do with thought, training and expertise really boils down to serving as a human sensor to connect inputs with appropriate responses.
“Our ability to interpret data coming from sensors -- in combination with lots and lots of sensors, which have gotten so much cheaper” -- has improved to the point that “an incredible range of tasks” can now be automated, Kaplan said. “So we can build systems and machines that are able to sense and be aware of their environment in very different ways than before.” And because “humans are actually not very good sensors,” that means millions of jobs will soon be transformed beyond recognition.
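Kaplan’s framing -- mapping sensor inputs to appropriate responses from enough labeled examples, with nothing that resembles thought -- can be illustrated with a minimal sketch. The sensor readings, labels and nearest-neighbor rule below are hypothetical illustrations, not drawn from his talk:

```python
# A minimal sketch of the point above: given enough labeled examples,
# a simple rule can connect raw sensor readings to appropriate responses
# without anything resembling "thinking." All data here is hypothetical.

# Hypothetical training data: (temperature, vibration) readings paired
# with the response a human operator would have chosen.
examples = [
    ((70.0, 0.1), "continue"),
    ((72.0, 0.2), "continue"),
    ((85.0, 0.6), "inspect"),
    ((95.0, 0.9), "shut_down"),
    ((98.0, 1.1), "shut_down"),
]

def respond(reading):
    """Return the response of the closest stored example (1-nearest neighbor)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, response = min(examples, key=lambda ex: distance(ex[0], reading))
    return response

print(respond((96.5, 1.0)))   # -> "shut_down"
print(respond((71.0, 0.15)))  # -> "continue"
```

With more sensors and far more examples, the same pattern-matching idea -- scaled up with machine learning rather than a hand-built lookup -- covers the “incredible range of tasks” Kaplan describes.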
Artificial intelligence has huge ramifications on the order of “the wheel or the steam engine,” he added. “But it’s not magical. And we’re well on our way to making a mess of things.”
The new technologies and the disruption they cause will eventually create new jobs, Kaplan told GCN after his presentation. He referred to the oft-cited decline in agricultural jobs -- whose workers now constitute just 2 percent of the U.S. workforce, down from more than 90 percent 150 years ago -- and the creation of countless new jobs that would have been inconceivable to a Reconstruction-era American.
Life is far better now that every worker is not stuck behind a plow, he said. And he predicted that in the future, “it eventually may take only 2 percent of the population, coupled with some pretty remarkable automation, to accomplish what it currently takes 90 percent of our population to accomplish.”
“I’m supremely confident that our future is bright,” Kaplan added. “I see no reason that this pattern won’t continue. But the key word here is ‘eventually.’”
It takes time for these transitions to happen and for new types of work to emerge, he said. “And AI is going to accelerate the pace of this job destruction and job transformation.”
So why should federal technologists care? Or rather, why should they care more than any professional whose job could be hollowed out and de-skilled by ever more capable automation?
For starters, many government missions could benefit from the explosion of sensor-driven data and machine learning. Thinking about AI as a natural continuation of automation -- and being open to replacing “human sensors” in existing processes -- could allow agencies to radically improve their effectiveness.
And Kaplan said it’s equally important to think about the types of tasks that are not very susceptible to automation. Jobs that include a broad range of responsibilities and especially those that deliver person-to-person interaction will require human workers for the foreseeable future, so agencies aiming for better citizen service would do well to build their systems in ways that put real people at the critical personal touch points.
At the broader policy level, the government must consider what’s required to serve a citizenry that could soon face structural unemployment on a massive scale. Does that mean a reinvention of vocational training? Investment in infrastructure to speed the creation of new forms of work? Tax policies to encourage a broader sharing of the economic benefits automation can produce for business owners? A Public Works Administration for the Digital Age?
Kaplan was quick to stress that his expertise is in technology, not public policy, but he said bold solutions will be required.
“Automation puts people out of work,” he told GCN. “That’s almost the definition of it. And my personal point of view is that it’s not the job of the innovators to take care of the people they’re displacing.”
Nevertheless, “somebody else has to step up,” he said. “And that somebody either is government or is facilitated by government policies.”