Data center consolidation improves cross-agency cooperation
The federal Data Center Consolidation Initiative emerges as an enabler of information sharing across government.
The government’s data center consolidation effort is emerging as an enabler of information sharing between agencies, a panel of intelligence and security officials said Tuesday.
Five years ago, the Homeland Security Department began the task of folding 24 data centers into two, said Deputy Chief Information Officer Margaret Graves. The two new centers recently went into operation. “We also are availing ourselves of the opportunity to modernize,” Graves said, and to make data accessible across the enterprise rather than only to the system or information owner.
Policy enforcement to provide access management controls in the new centers ensures that data can be secured at the appropriate level, she said. That means agencies do not have to default to the most restrictive set of security controls for all data, so information can be made available to the appropriate personnel from various agencies.
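As a rough illustration of the kind of policy enforcement Graves described, an access decision can compare a data object's own sensitivity label against the requesting user's attributes instead of locking everything at the most restrictive level. The labels, compartment tags and clearance levels below are hypothetical, not DHS's actual scheme.

```python
# Hypothetical sketch of label-based access enforcement: data objects carry
# their own sensitivity labels, so controls match the data rather than
# defaulting to the most restrictive setting for the whole data center.

# Ordered sensitivity levels (illustrative, not an official scheme).
LEVELS = {"unclassified": 0, "sensitive": 1, "secret": 2, "top_secret": 3}

def can_access(user, record):
    """Grant access only if the user's clearance covers the record's level
    and the user holds every compartment tag attached to the record."""
    if LEVELS[user["clearance"]] < LEVELS[record["level"]]:
        return False
    return record["compartments"].issubset(user["compartments"])

# Example: an analyst from another agency can read a 'secret' record
# without that record being locked behind top-secret controls.
analyst = {"clearance": "secret", "compartments": {"border_ops"}}
record = {"level": "secret", "compartments": {"border_ops"}}
print(can_access(analyst, record))  # True
```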
Graves spoke on the challenges of connecting the dots in intelligence at a government information technology forum hosted in Washington by InformationWeek. Also speaking were Don Burke, who helps oversee the Intellipedia wiki at the CIA, and Casey Henson, CIO of the Defense Intelligence Agency.
Although DHS has been working on consolidation for five years, the federal Data Center Consolidation Initiative was announced only in February. It requires agencies to have a preliminary plan in place by June 30 and a final plan by the end of the year, with consolidation to begin in 2011. According to the Federal CIO Council, the purpose of the program is to reduce costs, improve energy efficiency, increase IT security and promote the use of more efficient computing platforms such as cloud computing.
But Henson said consolidation also will help DIA provide broader access to data, enabling data brokering with effective identity management and access controls.
“The intelligence community has embraced the requirement to provide information instead of just protect it,” Henson said.
Separating the data from the services that use it by making it available in common data centers is one of three initiatives within the intelligence community to enable sharing, she said. The other two are identity and access management, and interoperability across agencies and security classifications. Identity and access management probably is the hardest nut to crack, she said.
“We have been working on this for a decade,” she said. “We saw if we don’t do that, we will never be interoperable.”
DIA, CIA and the National Security Agency now are sharing an authentication and access scheme, and additional intelligence agencies are expected to join within the next 12 to 18 months, Henson said.
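In the simplest terms, a shared authentication and access scheme of the sort Henson described means participating agencies validate the same signed identity assertion rather than each running its own check. The sketch below is purely illustrative; real deployments would rely on PKI certificates and standardized assertion formats, not a shared key, and the names used here are invented.

```python
import hmac, hashlib, json

# Illustrative only: a signed identity assertion that several agencies can
# verify against the same trust anchor, so a user authenticated by one
# agency can be recognized by another.
TRUST_ANCHOR_KEY = b"shared-trust-anchor"  # hypothetical shared trust anchor

def issue_assertion(user_id, clearance):
    """Issue a signed claim about a user's identity and clearance."""
    claims = json.dumps({"sub": user_id, "clearance": clearance}, sort_keys=True)
    sig = hmac.new(TRUST_ANCHOR_KEY, claims.encode(), hashlib.sha256).hexdigest()
    return {"claims": claims, "sig": sig}

def verify_assertion(assertion):
    """Any agency trusting the same anchor can verify the assertion."""
    expected = hmac.new(TRUST_ANCHOR_KEY, assertion["claims"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, assertion["sig"])

token = issue_assertion("analyst42", "secret")  # issued by one agency
print(verify_assertion(token))                  # accepted by another: True
```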
DHS also hosts the program management office for the National Information Exchange Model, an XML-based framework to automate sharing across federal, state and local government as well as the private sector. NIEM began in the Justice Department before moving to DHS, and 10 agencies now are using the framework, Graves said. NIEM allows queries across multiple databases to produce results on a single screen.
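Conceptually, an XML-based exchange model lets each system map its records to a common vocabulary so a single query can be fanned out to multiple databases and the results merged into one view. The element names in this sketch are simplified stand-ins, not the actual NIEM namespaces or schema components.

```python
import xml.etree.ElementTree as ET

# Illustrative only: each source publishes records in a shared XML
# vocabulary, so one query can run against all of them and the hits can
# be merged onto a single screen. Element names are invented for clarity.
FEDERAL_DB = "<Records><Person><Name>J. Doe</Name><Source>DHS</Source></Person></Records>"
STATE_DB = "<Records><Person><Name>J. Doe</Name><Source>State Police</Source></Person></Records>"

def query_sources(name, *xml_sources):
    """Search every source for a matching Person and merge the hits."""
    hits = []
    for xml_doc in xml_sources:
        for person in ET.fromstring(xml_doc).iter("Person"):
            if person.findtext("Name") == name:
                hits.append((person.findtext("Name"), person.findtext("Source")))
    return hits

# One query, results from multiple databases in a single result set.
for name, source in query_sources("J. Doe", FEDERAL_DB, STATE_DB):
    print(f"{name} -- {source}")
```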
NIEM and data center consolidation are large, top-down projects, but there also are grassroots efforts to enable information sharing, such as Intellipedia, a user-created wiki intended to allow collaboration within the intelligence community.
“It is a suite of tools, not just one tool,” Burke said. Launched in 2006, it is a set of Web 2.0 applications running on the top-secret, secret and unclassified networks of the intelligence intranet and hosted by the Office of the Director of National Intelligence. Its adoption has come gradually over the past four years, and it now is being used by White House advisers and analysts, he said.
Burke said that developers have to be able to build on emergent technologies such as the wiki to address future capabilities and needs, rather than relying on top-down initiatives or formal user requirements. New tools cannot be developed according to what users want, because what they want are more and better versions of existing tools, he said.
But once an emergent technology has been adopted, top-down standards and implementation guidance are needed to ensure interoperability, Henson said.