Tap value of big data with strategy, infrastructure and roadmap
With computing costs coming down, agencies can unlock capabilities for analyzing and finding hidden value in data.
Government agencies had worked for years on complex analytic projects in many domains before the term big data came along.
What has changed is that the cost of computing has come down, unlocking capabilities for agencies to analyze and find hidden value in data, said Steve Mills, co-chair of the TechAmerica Big Data Commission and senior vice president and group executive with IBM.
Data analysis, of course, generates more data. And the emergence of cloud computing has enhanced and accelerated the analysis and delivery of big data.
"Cloud computing is an enabler of what we are calling big data," said Teresa Carlson, a vice chair of the commission and vice president of the global public sector at Amazon Web Services. Projects that took months and years can now actually take days. Projects that cost millions of dollars in the past now cost thousands of dollars.
"We have examples of companies that have done things with 50,000 cores of processors in three hours where it would have taken 12 years to process the data" in the past. "This represents an IT shift," Carlson said, adding that real value is getting information from the data.
Mills and Carlson were speaking to a Capitol Hill audience Oct. 3 upon the release of the commission’s report, "Demystifying Big Data: A Practical Guide to Transforming the Business of Government." Among the report’s recommendations is that, because of the crucial role big data will play, the federal government and individual agencies need to consider creating the position of chief data officer.
Big data is all about getting the right information to the right people, said Bill Perlowitz, a vice chair of the TechAmerica Big Data Commission and CTO at Wyle, a provider of technical services to the government and businesses. "If you can get the right health care data to a health care practitioner, you can change the outcome of health care. If you get the right security information to an intelligence officer, you can enhance national security. If you get the right data to an entrepreneur, you can help create jobs."
To make data a strategic asset and achieve mission outcomes, agencies need to start including data in their plans and their enterprise architectures, Perlowitz said. That idea was expressed in the White House Digital Government Strategy issued in May, which cited the need "to unlock the power of government data to spur innovation across the nation and improve quality of services to the American people."
The commission recommends that the federal government create an information strategy to help guide the big data vision. That strategy would include developing a national information infrastructure: the technology capabilities needed to establish a common information framework for sharing data, Perlowitz said.
In recent years, federal, state and local government agencies have struggled to handle the sheer volume, variety and velocity of data created within their own enterprises and across government. As officials navigate this tidal wave of data, they are asking how to make intelligent decisions and derive real value from it.
The commission report also contains big data case studies that identify existing and emerging challenges that will require agencies to further evolve their IT infrastructures and technology. For example, the National Archives and Records Administration is struggling to digitize more than 4 million cubic feet of traditional archival holdings, including about 400 million pages of classified information scheduled for declassification, pending review by the intelligence community.