Researchers take a new tack on supercomputing
A team of veteran computer designers has just begun a five-year, $13.5 million project to design a so-called OptIPuter that would eliminate the traditional bandwidth bottlenecks of today's large servers.
Using National Science Foundation funding, the researchers want to design a computer with components that will talk to each other through fiber optics. Even more radically, the researchers want to rethink the way a supercomputer is structured, said Gregory R. Hidley, head of technology infrastructure at the California Institute for Telecommunications and Information Technology (CalIT2) of La Jolla, Calif.
The project's principal investigator is Larry Smarr, former head of the National Center for Supercomputing Applications in Champaign, Ill.
For several decades, microprocessor speeds have roughly doubled every 18 months, a rule of thumb known as Moore's Law. But communication between a large system's processors and its memory and disk storage has not kept pace, and it has become the bottleneck.
Optical network bandwidth, by contrast, is now doubling roughly every nine months, Hidley said.
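The gap compounds quickly. As a rough, illustrative sketch only, assuming clean exponential growth at the doubling rates cited above and using the project's five-year span as the horizon, the two curves diverge by roughly an order of magnitude:

```python
# Back-of-the-envelope comparison of the two doubling rates cited in the article.
# Assumes idealized exponential growth; real-world figures will vary.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Return the growth multiple after `months`, given one doubling
    every `doubling_period_months`."""
    return 2 ** (months / doubling_period_months)

horizon = 60  # five years, the stated length of the OptIPuter project

cpu_gain = growth_factor(horizon, 18)  # Moore's Law: doubling every 18 months
net_gain = growth_factor(horizon, 9)   # optical bandwidth: doubling every 9 months

print(f"Processor speed over 5 years:   ~{cpu_gain:.0f}x")  # prints ~10x
print(f"Optical bandwidth over 5 years: ~{net_gain:.0f}x")  # prints ~102x
```

Under those assumptions, network capacity grows about ten times faster than processor speed over the life of the project, which is the premise behind making the optical interconnect, rather than the processor, the center of the design.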
In the proposed OptIPuter, to be built in San Diego, the networks and interconnects will be as fast as the processors, if not faster, Hidley said.
The OptIPuter will use high-end optical switches and routers from Chiaro Networks Ltd., a startup in Richardson, Texas.
The optical switches can transmit up to 6.3 trillion bits per second, said Steven J. Wallach, vice president for technology at Chiaro Networks and a co-founder of former supercomputer maker Convex Computer Corp.
The CalIT2 team aims to make the OptIPuter compatible with grid-computing projects that are springing up in the research community.
Ultimately, OptIPuter technology could be used in a vast infrastructure to gather streams of data from weather and ground sensors, Hidley said. Such technology could be useful to agencies responding to natural disasters and terrorist attacks.
High-bandwidth computing could also make telemedicine easier by speeding the transfer of large data files from medical imaging devices, said Philip M. Papadopoulos, program director for grid and cluster computing at the NSF-funded San Diego Supercomputer Center.