Livermore opens supercomputer cluster to big data collaborators
Lawrence Livermore National Laboratory is making its Catalyst supercomputing cluster available to industry, universities and other collaborators to test big data technologies, architectures and applications, the lab announced.
Developed by a partnership of Cray, Intel and Lawrence Livermore, the Cray CS300 high-performance cluster is available through Livermore's High Performance Computing Innovation Center (HPCIC).
Catalyst, built for the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing program, features 128 GB of dynamic random access memory (DRAM) per node, 800 GB of non-volatile memory (NVRAM) per compute node, 3.2 TB of NVRAM per Lustre router node and improved cluster networking with dual-rail Quad Data Rate Intel TrueScale fabrics.
NVRAM will be familiar to anyone who has used a USB stick or an MP3 player; it is simply persistent memory that retains its data even when the power is off, hence "non-volatile," the lab explained.
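To make the "non-volatile" idea concrete, the short Python sketch below shows how an application might stage a working array on node-local NVRAM exposed as a file system and read it back after a restart. The mount point, file name and array size are hypothetical illustrations, not details drawn from the Catalyst configuration.

```python
import numpy as np

# Hypothetical mount point where a node's NVRAM is exposed as a file system.
NVRAM_PATH = "/nvram/scratch/checkpoint.dat"

# Stage a working array on the NVRAM-backed file instead of ordinary DRAM.
data = np.memmap(NVRAM_PATH, dtype=np.float64, mode="w+", shape=(1_000_000,))
data[:] = np.random.rand(1_000_000)
data.flush()  # push the bytes to the persistent device

# ...power cycle or job restart...

# Reopening the same file recovers the data, which is what "non-volatile" means;
# contents of DRAM would have been lost.
restored = np.memmap(NVRAM_PATH, dtype=np.float64, mode="r", shape=(1_000_000,))
print(restored[:5])
```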
The system's increased storage capacity represents a major departure from the classic simulation-focused computing architectures common at Department of Energy laboratories and opens new opportunities for combining floating-point-intensive simulation with data analysis in a single environment.
Deployed in October 2013, the Catalyst architecture already has begun to provide insights into the kind of technologies the NNSA’s Advanced Simulation and Computing program will require over the next decade to meet high performance simulation and big data computing mission needs.
Current research on Catalyst includes developing scalable analysis tools for next-generation genomic sequencing and exploring new deep learning architectures that could have a major impact on big data analytics.
"Our purpose is to use Catalyst as a test bed to develop optimization strategies for data-intensive computing," said Fred Streitz, director of the HPCIC. "We believe that advancing big data technology is a key to accelerating the innovation that underpins our economic vitality and global competiveness."
Companies interested in access to Catalyst are invited to respond to the Notice of Opportunity posted on the Federal Business Opportunities website and to visit the HPC Innovation Center website.