NSF projects bring supercomputing to mainstream research
The National Science Foundation announced two new supercomputing projects designed to expand access to high-performance computing power for researchers working in fields where supercomputers have not traditionally been used, including the medical, social science and education communities.
The new projects, “Bridges” at the Pittsburgh Supercomputing Center and “Jetstream” at Indiana University’s Pervasive Technology Institute (PTI) and the University of Texas at Austin's Texas Advanced Computing Center, are designed to meet demand for more powerful computing resources beyond the traditional supercomputing community.
Irene Qualters, division director for advanced cyber infrastructure at NSF, said the programs would push science forward “by exploiting interactive and cloud-based computing paradigms.”
The platforms offer high-performance computing features and functions, she said, including large memory nodes and virtualization services that “offer a PC-like experience via the cloud.”
The two new programs will be supported by NSF's eXtreme Digital (XD) program, a set of digital resources and services for open science research. In addition, NSF’s Extreme Science and Engineering Discovery Environment (XSEDE) will supply both projects with user services as well as training from NSF’s infrastructure, software and applications experts and scientists, the agency said.
Bridges will help tackle problems in genetics, the natural sciences and the humanities, fields where scientists work on problems shaped more by variations in data volume than by supercomputing speed or performance. The program will offer scientists in these fields a menu of memory, bandwidth and computing power scaled to their research, NSF said.
"Bridges will bring supercomputing to nontraditional users and research communities,” said Nick Nystrom, director of strategic applications at the Pittsburgh center. “Second, its data-intensive architecture will allow high-performance computing to be applied effectively to big data. Third, it will bridge supercomputing to university campuses to ease access and provide burst capability."
The Jetstream project, hosted by Indiana University's PTI, is designed to add cloud-based computation to cyber infrastructure research.
Using Jetstream, researchers will be able to create virtual machines that look and feel like their lab workstation or home machine, according to NSF, but that operate with thousands of times the computing power.
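To make that idea concrete, the sketch below shows how a researcher might request such a virtual machine programmatically. It is only an illustration: the article does not describe Jetstream's interface, so the example assumes an OpenStack-style cloud accessed through the openstacksdk Python library, and the cloud profile, image, flavor and network names are hypothetical placeholders.

```python
# Illustrative sketch only: assumes an OpenStack-style cloud reachable through
# the openstacksdk library. The cloud name, image, flavor and network below
# are hypothetical placeholders, not actual Jetstream identifiers.
import openstack

conn = openstack.connect(cloud="my-research-cloud")   # entry from clouds.yaml

image = conn.compute.find_image("desktop-linux")      # a workstation-like image
flavor = conn.compute.find_flavor("m1.medium")        # CPU/memory size of the VM
network = conn.network.find_network("public")

server = conn.compute.create_server(
    name="lab-workstation-in-the-cloud",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
conn.compute.wait_for_server(server)                  # block until the VM is active
print("Virtual machine is up and ready to use")
```

The appeal the article points to is that the resulting machine behaves like a familiar desktop while drawing on far larger pools of computing power behind it.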
Craig Stewart, PTI executive director for research technologies at Indiana University, said the Jetstream cloud will link the existing NSF cyber infrastructure with researchers and students who are new to the NSF XD program.
"Jetstream will give researchers access to cloud computing and data analysis resources interactively, when they need them,” he said, adding that it should be “of particular interest to researchers analyzing ‘born digital' data with research needs that are more suited to cloud computing than the traditional supercomputers that have been the mainstay of NSF-funded cyber infrastructure in the past."
The NSF believes the new systems will also make working with supercomputers easier for mainstream researchers.
Where traditional supercomputing relies on a batch queuing system, for example, with computation running only when processors become available, the new systems let users conduct research on demand and pose “what-if” questions to explore problems in new ways.
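As a rough illustration of that difference, the sketch below contrasts the two workflows. It assumes a SLURM-style batch scheduler on the traditional system, which the article does not name, and the job script and "what-if" function are hypothetical stand-ins.

```python
# Rough illustration of the two usage models described above. Assumes a
# SLURM-style scheduler ("sbatch"/"squeue") on the traditional system; the
# job script and the what-if function are hypothetical placeholders.
import subprocess

# Traditional batch model: submit a job script, then wait for processors
# to become available before the computation actually runs.
subprocess.run(["sbatch", "simulation_job.sh"], check=True)
subprocess.run(["squeue", "-u", "researcher"], check=True)  # check queue status later

# On-demand, interactive model: each "what-if" variation runs immediately,
# so exploring a new scenario is just another function call.
def explore_scenario(parameter: float) -> None:
    print(f"running what-if scenario with parameter = {parameter}")

for value in (0.1, 0.5, 0.9):
    explore_scenario(value)
```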
NSF said Bridges and Jetstream will be online by early 2016.