NSF seeks theoretical limits of computation
The National Science Foundation plans to fund research into the theoretical limits of computation and information transfer.
How much data can be computationally analyzed? How much data can be sent through a network? The National Science Foundation will spend $32 million to answer just such questions.
This spring, NSF will dole out the funds to agencies, universities and research labs to determine the theoretical limits of computation and information transfer.
"This [work] will address some fundamental problems in computer science," said Kamal Abdali, director of NSF's Division of Computing and Communication Foundations. "How much information can you send over a fixed channel? Is something computable? And if something is computable, how many resources are needed?"
In a solicitation released late last month, the agency said it wants to "determine inherent limits of computation and communication, and to obtain optimal solutions within those limits." The agency expects to award 90 to 100 grants, each averaging about $125,000 for three years of research. The deadline for submitting proposals is March 4.
The work will draw from studies in computer science, scientific computing, communications and signal processing theory, mathematics and other fields.
In a 1948 paper, "A Mathematical Theory of Communication," Bell Labs mathematician Claude Shannon established information theory as a field of scientific inquiry, defining fundamental rules concerning the transfer of information.
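For context, the best known of those rules (a gloss added here, not a formula quoted in the article) is the Shannon-Hartley theorem, which caps the rate C at which data can move error-free through a channel of bandwidth B and signal-to-noise ratio S/N:

$$C = B \log_2\left(1 + \frac{S}{N}\right)$$

No coding scheme, however clever, can reliably exceed that rate, and it is exactly the sort of inherent limit the NSF solicitation asks researchers to pin down.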
NSF will expand on Shannon's work by looking into computational complexity, or how much time and other resources it takes to process information computationally, Abdali said.
"Depending on different models, we might find different limits," he said. Approaches such as parallel or distributed computing may increase the ability to solve complex problems, for example.
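As a minimal sketch of that idea (an illustration, not an example from the solicitation), the Python script below splits a prime-counting task across four worker processes, the kind of parallel decomposition Abdali alluded to:

```python
# Illustrative sketch: count primes below N by splitting the
# range across parallel worker processes.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in the half-open range [lo, hi) by trial division."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    N, workers = 200_000, 4
    step = N // workers
    chunks = [(i, i + step) for i in range(0, N, step)]
    with Pool(processes=workers) as pool:
        # Each worker counts one sub-range; the partial counts are summed.
        total = sum(pool.map(count_primes, chunks))
    print(f"primes below {N}: {total}")
```

Because each sub-range is independent, adding workers cuts the wall-clock time roughly in proportion, though the theoretical limits of such speedups are themselves a subject of the research NSF wants to fund.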
The solicitation identifies parallel research tracks in communications and computing. Each track will develop core theories, or basic models of computational complexity, along with fundamental algorithms and application-specific techniques for solving particular problems.
A better mathematical understanding of computation will be vital to boosting computational power, Abdali said. Advances in microchip fabrication let each generation of computers process more data than the last, but advances in algorithms can speed processing as well, he said.
"Algorithmic advances have been more significant than the hardware advances" in speeding processing times, Abdali said. He pointed to advances in sorting techniques that have dramatically cut the time needed to sift through large amounts of information.
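To illustrate the point (an example added here, not one cited by Abdali), the Python sketch below times a classic quadratic insertion sort against Python's built-in O(n log n) sort on the same data; the gap widens rapidly as the input grows:

```python
# Illustrative timing comparison: a quadratic sort vs. Python's
# built-in O(n log n) Timsort on identical random data.
import random
import time

def insertion_sort(values):
    """Classic O(n^2) sort: shift each element left until it is in place."""
    a = list(values)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

data = [random.random() for _ in range(20_000)]

t0 = time.perf_counter()
insertion_sort(data)
t1 = time.perf_counter()
sorted(data)  # built-in Timsort: O(n log n) comparisons
t2 = time.perf_counter()

print(f"insertion sort: {t1 - t0:.2f}s   built-in sort: {t2 - t1:.4f}s")
```

Doubling the input roughly quadruples the quadratic sort's running time while only slightly more than doubling the built-in sort's, which is why a smarter algorithm can outpace years of hardware improvement.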