Defense 2.0 a work in progress
The Defense 2.0 era is still years away as culture, inertia, and IT security concerns that grow only more complicated in a Web 2.0 world continue to challenge military IT leaders.
The vision of adopting next-generation Internet technologies at the Defense Department is gaining new urgency as businesses and their employees increasingly embrace Web-based social networking applications. But the horizon for what some call the Defense 2.0 era appears a long way off, as culture, inertia, and IT security concerns that only grow more complicated in a Web 2.0 world continue to challenge military IT leaders.
The gap between vision and reality was plainly evident as senior DOD officials and industry experts debated the implications of Web 2.0 technologies yesterday at a conference held by the Information Technology Association of America.
Robert Carey, chief information officer of the Department of the Navy, was among those who have seen Web 2.0 applications make positive contributions to DOD's work. Carey, who has gained attention as one of the highest-ranking CIOs in DOD to write his own blog, is also an active wiki user.
'Wikis offer in my world [a better way] to write policy,' he said. In the old paradigm, it could take 18 months to complete a policy document, with all the comments back and forth, he said. 'Now I can have something in 60 days. I tell people, 'Write the words like you want them, not your comments.' '
Carey has also been impressed by how quickly IT solutions have emerged on the fly through mashups of multiple Web-based applications. He cited the Combined Information Data Network Exchange as one example. What began as a do-it-yourself project in the field, using Google Maps to track battlefield actions, including indirect fires and improvised explosive device events, quickly gained wider use by battlefield commanders. It might have taken years, and millions of dollars, to build something comparable under DOD's traditional IT approaches.
John Hale, chief of solutions delivery for the Office of the Director of National Intelligence, which spawned Intellipedia, sees Web 2.0 differently.
'Web 2.0 in my world has nothing to do with technology. It's all about enabling the end user and the data,' he said. With 78,000 registered users in the intelligence community and 250,000 additional individuals qualified to view the data, 'Intellipedia has been the single most ground-shaking way to share info in the community.'
The challenge of Web 2.0 applications, however, remains how to engineer security safeguards into what amount to ad hoc systems.
The ideal approach is to make security part of system design from the beginning. 'The problem is when you cobble things together,' said Air Force Brig. Gen. Ronnie Hawkins, deputy director for policy and resources (SAF/XCP-2).
'Many of our coalition partners use collaboration tools more than we do at DOD,' he said. 'We're more strict and locked down. I appreciate, having lived on the edge, why we need to do that.' But he has also seen the speed and advantages Web 2.0 tools can offer. 'It comes down to behavior in many regards.'
'We should not get ahead of ourselves and allow next-generation capabilities until we understand how it architecturally operates,' said Mitch Komaroff of the office of the Assistant Secretary of Defense for Networks and Information Integration. 'We haven't solved the last generation.'
Chris Daly, technical competency lead for security, IBM Federal Division, was among the more blunt about the technical risks of Web 2.0 applications.
Web 2.0 offers 'the promise of worse security,' he said. 'Web 1.0 at least had a security model, relying on a browser, SSL [Secure Sockets Layer], and the [security built into] Web servers. Web 2.0 was built using cross-site scripting,' creating opportunities for 'man-in-the-middle attacks.' The fix would involve settling standards that certify data and its source and redesigning browsers to conduct and manage multisite certifications, he said.
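To make Daly's contrast concrete, the sketch below shows the mashup pattern he is describing; the widget URL and function names are hypothetical, not from the conference. The point is that a script included from a second origin runs with the full privileges of the host page, outside the single-origin browser/SSL trust model of Web 1.0.

```typescript
// Illustrative sketch only; the widget origin and names are hypothetical.
// Web 1.0 model: one origin, one SSL channel, one security context.
// Web 2.0 mashup pattern: pull live code from a second origin.
function addWidget(widgetUrl: string): void {
  const tag = document.createElement("script");
  tag.src = widgetUrl; // the same-origin policy does not block script inclusion
  document.head.appendChild(tag);
}

// The included script executes with the host page's full authority:
// it can read document.cookie, form fields, and the DOM of every other
// widget, in effect acting as a man in the middle inside the page itself.
addWidget("https://partner.example/map-widget.js");
```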
Mashups present a particular challenge, he said. Without greater administrator control, mashups can disclose more data than intended.
Work is under way at IBM, he said, to provide improved security for enterprise mashups through a technology named SMash (short for secure mashup). It offers a secure component model in which components supplied by different trust domains interact through a communication abstraction that makes it relatively easy to specify security policies. In theory, a technology such as SMash can unlock data and services so they can interoperate in ways that allow data to be used more synergistically.
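As a rough illustration of that component model, consider the sketch below. It uses today's standard browser primitives, iframes and postMessage, rather than SMash's own library (whose original implementation predates those APIs), and the origin and message names are assumptions made for the example. The idea is the same: each component keeps to its own trust domain, and all interaction flows through a narrow channel where a security policy can be enforced.

```typescript
// Hypothetical sketch of a SMash-style secure component model using
// iframes and postMessage; the origin and message names are invented.
const WIDGET_ORIGIN = "https://maps.partner.example";

// Host page: load the component in its own cross-origin browsing context.
// The same-origin policy keeps it from touching the host page's DOM,
// cookies, or other components directly.
const frame = document.createElement("iframe");
frame.src = `${WIDGET_ORIGIN}/widget.html`;
document.body.appendChild(frame);

// The policy choke point: accept only messages from the expected trust
// domain, and act only on known message types.
window.addEventListener("message", (event: MessageEvent) => {
  if (event.origin !== WIDGET_ORIGIN) return; // enforce the trust boundary
  if (event.data?.type === "locationSelected") {
    console.log("widget reported:", event.data.payload);
  }
});

// Messages to the component name its origin explicitly, so data is never
// delivered to a frame that has navigated somewhere unexpected.
frame.addEventListener("load", () => {
  frame.contentWindow?.postMessage(
    { type: "init", payload: { units: "metric" } },
    WIDGET_ORIGIN
  );
});
```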
Cloud computing
What wasn't in question at the conference was how rapidly the world is migrating to, and coming to rely on, the Web.
'We're entering a fundamentally new, and third, phase of the Internet, as profound as the World Wide Web,' in which data and applications reside somewhere other than on stand-alone computers or Web sites and can be cross-linked in new and powerful ways, said Michael R. Nelson, visiting professor of Internet studies in Georgetown University's Communication, Culture, and Technology Program.
The massive shift of activity to the Internet has already begun to unleash an 'exaflood' of online traffic, said Nelson, using a term coined by Bret Swanson and George Gilder of the Discovery Institute. He cited their most recent study, released in January, which estimates that monthly Internet traffic will double between 2007 and 2011, to approximately 10 exabytes of data. An exabyte is a billion billion bytes of information. That figure will grow tenfold by 2015 as video, gaming, cloud computing, and business IP traffic mushroom, he said.
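Back-of-the-envelope arithmetic, using only the estimates quoted above and assuming decimal units (1 exabyte = 10^18 bytes) and a 30-day month, shows the sustained throughput those monthly totals imply:

```typescript
// Rough arithmetic on the cited "exaflood" estimates.
// Assumptions: 1 EB = 1e18 bytes (decimal), 30-day month.
const EXABYTE = 1e18;                 // bytes
const MONTH_SECONDS = 30 * 24 * 3600; // 2,592,000 seconds

const traffic2011 = 10 * EXABYTE;     // ~10 EB/month (cited 2011 estimate)
const traffic2015 = 10 * traffic2011; // tenfold growth: ~100 EB/month by 2015

// Average aggregate throughput implied by a monthly total, in gigabits/sec.
const gbps = (bytesPerMonth: number): number =>
  (bytesPerMonth * 8) / MONTH_SECONDS / 1e9;

console.log(gbps(traffic2011).toFixed(0)); // ~30864 Gbit/s, about 31 Tbit/s
console.log(gbps(traffic2015).toFixed(0)); // ~308642 Gbit/s, about 309 Tbit/s
```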
Nelson predicted that 80 percent of all computing will be done in the cloud within as little as five years, though more likely 10, as the technical agreements, business practices, and, most of all, cultural barriers surrounding cloud computing are worked out.
'CIOs and their teams will have to adapt to new roles,' he said. 'They won't be involved in managing their own servers, but [in managing access] to someone else's infrastructure' and in helping employees use the new technology.
Lewis Shepherd, chief technology officer of the Microsoft Institute for Advanced Technology in Governments and a former senior technology officer at the Defense Intelligence Agency, suggested there are many reasons why not everything will move to the Web.
As an outgrowth of Moore's Law, the evolution of mini- and multi-core processors will continue to make computing available in increasingly smaller client form factors, he said, in some cases invisible and in others embedded. Consequently, 'we don't see everything going to cloud computing,' Shepherd said.
He also suggested that some of the security concerns of Web 2.0 will be resolved as data protection solutions evolve.
Today's mashups turn the browser into a multiuser system, he said, noting that mutually distrusting domains become co-users, with no control over content integrated from different domains.
The answer won't be in the browser, he said, but in the data and in treating the cloud like an operating system. That's 'why we're placing a lot of [research and development] effort on a cloud-based approach as the right path.' It's also why 'I'm incredibly optimistic about Web 3.0, the semantic Web, which is all about the value of embedded data.'