Greater supercomputer coordination urged
A report issued by the White House's Office of Science and Technology Policy recommends that agencies with supercomputers work more closely to share and develop resources. New House legislation introduced last month also calls for greater interagency collaboration, as well as central oversight of federal supercomputing resources.
Both the White House task force and House members expressed the need for greater coordination in two areas of supercomputing: sharing existing resources and defining a coherent roadmap for future development of large-scale machines.
John Marburger, director of the policy office, unveiled the report in a House Science Committee hearing today on HR 4218, the High Performance Computing Revitalization Act.
The report, Federal Plan for High-End Computing, was prepared by the White House's High-End Computing Task Force. The task force recommended that agencies with supercomputers coordinate to provide more computational power for themselves and other users.
Introduced by Rep. Judy Biggert (R-Ill.) on April 28, the High Performance Computing Revitalization Act would establish an interagency advisory committee to oversee a road map for supercomputing development.
"The interagency planning process has lost the vitality it once had," Biggert said, referring to the High Performance Computing Act of 1991, which HR 4218 would amend. Biggert's bill would require Marburger's office to "direct an interagency planning process and develop and maintain a road map for the research, development and deployment of high performance computing resources," she said.
Both efforts address a perceived loss of global leadership in supercomputing technologies. Today, agencies are experiencing difficulties in obtaining the computational power they need for large research projects, Marburger said. The drive toward clustering commercially available servers has shrunk the market for high-performance computers, driving up costs, so agencies whose needs can't be served by clusters are particularly hard hit.
Speaking before the committee, Dan Reed, a professor at the University of North Carolina at Chapel Hill, said researchers face problems that cannot be addressed by the current generation of high-performance computers. Modeling the actions of a human lung, for instance, could lead to greater knowledge about epilepsy. Yet such a project would require an order-of-magnitude increase in performance over what today's computers can offer.
Pooling the government's resources and opening them to more users would help solve this problem, Marburger said. Large supercomputers should be treated as "national resources" available to all agencies and their constituents, he said.
In many cases, today's government supercomputers aren't freely available to outside researchers, speakers at the hearing testified. Rick Stevens, director of the National Science Foundation's TeraGrid project, said that current policy bars NSF from giving outside researchers access to its own supercomputing facilities.
The Energy Department's newly announced supercomputer at Oak Ridge National Laboratory will test the idea of a government agency running a supercomputer as a service for other agencies, as well as for private industry.
That computer will be made available for nonclassified research outside the Energy Department, either by academic institutions or by private companies, said Ray Orbach, director of the Energy Department's Office of Science. The laboratory will review potential jobs and pick those with the highest merit. Use of the supercomputer will be free as long as the results are published in a public forum, Orbach said.