Researchers try to quantify data
Study to analyze uses, opportunities and cost of information and data.
Researchers from universities and industry have announced a three-year multidisciplinary study to quantify the amount and kind of information being produced worldwide and how it is being used.
The goal of the 'How Much Information?' study is to move beyond the bits and bytes of previous studies and look at the uses, opportunities and cost of information and data, said Program Director James Short of the University of California at San Diego.
'The previous generation of studies were primarily based on sales forecasts for storage equipment and rudimentary usage patterns of that equipment,' Short said. But most information is not stored; it is being managed and processed. Understanding and measuring those processes is one of the goals of the program. 'Once you improve your measurements, you improve your understanding of the factors driving the growth. You can get on with the business of projecting the implications.'
The amount of data being produced and used is rising dramatically as the cost of producing, moving and storing it falls. 'That curve will continue,' Short said, and it will be a strong driver for information technology, with effects on the economy and everyday life. Predicting the degree of that impact will require a better understanding of the data itself.
Specialists from the University of California-San Diego (UCSD), University of California-Berkeley and Massachusetts Institute of Technology are conducting the program. It will be housed at the Global Information Industry Center (GIIC) at the UCSD School of International Relations and Pacific Studies, where Short is a research director. Industry experts from AT&T, Cisco Systems, IBM, LSI Corp., Oracle, Seagate Technology and the Palo Alto Research Center also will participate. The program will be supported by the San Diego Supercomputer Center and Jacobs School of Engineering.
Participants will be engineers, computer scientists, information economists and IT research faculty.
The study will look at home IT users in addition to public- and private-sector users.
Previous studies have focused on bits and bytes of data, which are relatively easy to quantify. But 'any discussion of information has to include the use and context' of that data, Short said. 'We will be focused very much on the differences between data growth and uses of that data.'
The study will look not only at the amount of data being transmitted and stored but also at factors such as density of network traffic, rate of growth, patterns and characteristics of use, and the rates of processing by applications. It will use statistical sampling to gather data, and the San Diego Supercomputer Center will use reference data sets in modeling to extrapolate current use and future trends.
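To illustrate the general approach, the following is a minimal, hypothetical sketch of how sampled measurements could be extrapolated into a growth projection. The figures, variable names and fitting method are assumptions for illustration only and are not taken from the study itself.

import numpy as np

# Hypothetical sampled measurements: year vs. estimated data volume in exabytes
# (values are invented for illustration, not from the study)
years = np.array([2005, 2006, 2007, 2008])
volume_eb = np.array([130.0, 161.0, 281.0, 487.0])

# Exponential growth is linear in log space, so fit a straight line to log(volume)
slope, intercept = np.polyfit(years, np.log(volume_eb), 1)

# Project the fitted trend a few years ahead
future_years = np.arange(2009, 2013)
projected_eb = np.exp(intercept + slope * future_years)

for year, eb in zip(future_years, projected_eb):
    print(f"{year}: roughly {eb:,.0f} exabytes (projected)")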
An initial report is expected by the end of the year, with research updates appearing periodically over the next three years. Three years might seem like a short time for such an ambitious project, but Short said the newly formed GIIC will provide a platform for a sustained look at the issue.
'This is a process of successive better looks at a complex issue,' he said.