Models ready for their close-ups
A team of climate research centers is sharing a $1.4 million grant from the National Science Foundation to develop the next generation of climate computer models, which will take advantage of powerful new supercomputers to track and predict climate change.
The emergence of computers capable of performing 1,000 trillion operations/sec will enable scientists to efficiently run climate simulations at resolutions several orders of magnitude finer than is now possible. Those simulations will be more accurate, offer finer detail and incorporate more variables, scientists say.
"The limiting factor to more reliable climate predictions at higher resolution is not scientific ideas but computational capacity to implement those ideas," said Jay Fein, a program director at NSF's Division of Atmospheric Sciences.
Supercomputers are starting to achieve the capacity to do that type of modeling.
Petaflop performance, or 1,000 trillion operations/sec, is rare. The first computer to achieve that speed was demonstrated for the National Nuclear Security Administration in June. In anticipation of such computing power becoming more commonly available, researchers at the University of Miami's Rosenstiel School of Marine and Atmospheric Science; the National Center for Atmospheric Research (NCAR) in Boulder, Colo.; the Center for Ocean-Land-Atmospheric Studies in Calverton, Md.; and the University of California at Berkeley will share the NSF grant to generate new climate models.
Scientists have been using computer models to forecast weather variables since the days of vacuum-tube computers in the 1950s. Advances in technology eventually led to models with longer timescales to predict climate change, and the science took a leap forward in the 1980s with the coupling of ocean and atmospheric models. Today's general circulation climate models use what NCAR calls a staggeringly complex array of variables that range from hourly events, such as frontal changes, to developments that take place over centuries, such as glaciation.
NCAR has developed one of the most sophisticated fully coupled atmosphere-ocean models, the Community Climate System Model (CCSM). Funded by NSF and the Energy Department and used by the National Oceanic and Atmospheric Administration, it is an open-source model available to any researcher. The first version was released in 1998; the current version, CCSM2, was released in 2002 and successfully re-created climate patterns from 1870 through 2000.
Next level
To produce a picture of a single day of the world's climate, the model must perform 700 billion calculations, according to NCAR. But all that computing power yields a peak resolution of only 1 degree of latitude by 1 degree of longitude, grid cells covering roughly 3,900 square miles each.
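To put those figures in perspective, here is a back-of-the-envelope sketch, not NCAR's code, that reproduces the arithmetic above. The only numbers taken from the article are the 700 billion calculations per simulated day and the petaflop rate; the grid-cell area follows from basic spherical geometry.

import math

EARTH_RADIUS_MI = 3959.0  # mean Earth radius in miles

def cell_area_sq_mi(lat_deg):
    """Approximate area of a 1-degree-by-1-degree grid cell centered at lat_deg."""
    one_degree = math.radians(1.0)
    height = EARTH_RADIUS_MI * one_degree  # ~69 miles per degree of latitude
    # a degree of longitude shrinks with the cosine of latitude
    width = EARTH_RADIUS_MI * one_degree * math.cos(math.radians(lat_deg))
    return width * height

print(f"Cell at 35 deg latitude: {cell_area_sq_mi(35):,.0f} sq mi")  # ~3,900 sq mi

CALCS_PER_SIM_DAY = 700e9  # per NCAR: calculations for one simulated day
for label, ops_per_sec in (("teraflop", 1e12), ("petaflop", 1e15)):
    print(f"One simulated day on a {label} machine: "
          f"{CALCS_PER_SIM_DAY / ops_per_sec:.4f} sec")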
Taking climate modeling to the next level so it can include elements such as eddies in ocean currents, which are narrower than 100 miles, will require enormous improvements in resolution. Other critical elements, such as tracking carbon dioxide levels and subtle feedback loops, will require even more calculations.
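The cost of that extra resolution grows much faster than the resolution itself. A common rule of thumb, not stated in the article but standard in numerical modeling, is that refining the horizontal grid multiplies the work in both horizontal dimensions and also forces a proportionally shorter timestep for numerical stability (the CFL condition). A minimal sketch under that assumption:

def relative_cost(refinement):
    """Rough cost multiplier for refining the horizontal grid by 'refinement'.

    Assumes cost scales with the number of cells in x and y plus a
    proportionally smaller timestep; vertical resolution held fixed.
    """
    cells = refinement ** 2  # more columns in both horizontal directions
    timesteps = refinement   # CFL condition: smaller cells need a smaller dt
    return cells * timesteps

# Moving from a 1-degree grid (~69 miles) to 0.1 degree, roughly what is
# needed to resolve ocean eddies narrower than 100 miles:
print(relative_cost(10))  # -> 1000, i.e. ~1,000x the computation

By that estimate, a tenfold improvement in resolution demands about three orders of magnitude more computing, which is why petaflop machines matter for this work.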
University of Miami meteorologist Ben Kirtman, a researcher in the NSF-funded program, has developed a strategy called interactive ensembles that isolates the interactions between weather and climate. The ensembles are being applied to CCSM, and his research will serve as the basis of a pilot program for implementing more computationally intensive models as greater computing power becomes available.
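As a rough illustration of the interactive-ensemble idea described in Kirtman's published work, the toy sketch below runs several atmosphere realizations against a single shared ocean and drives the ocean with their ensemble-mean surface flux, so member-to-member weather noise averages out while the coupled response remains. Every name and equation here is hypothetical and for illustration only; none of it comes from CCSM.

import random

N_MEMBERS = 6  # number of atmosphere ensemble members

def atmosphere_flux(sst, seed):
    """Toy surface heat flux: a deterministic response to sea-surface
    temperature plus random 'weather noise' unique to each member."""
    rng = random.Random(seed)
    deterministic = -0.5 * (sst - 15.0)  # relaxes the ocean toward 15 C
    noise = rng.gauss(0.0, 2.0)          # internal atmospheric variability
    return deterministic + noise

def step_ocean(sst, mean_flux, dt=0.1):
    """Toy ocean update driven by the ensemble-mean flux."""
    return sst + dt * mean_flux

sst = 18.0  # initial sea-surface temperature, degrees C
for step in range(100):
    fluxes = [atmosphere_flux(sst, seed=step * N_MEMBERS + m)
              for m in range(N_MEMBERS)]
    sst = step_ocean(sst, sum(fluxes) / N_MEMBERS)  # noise shrinks as 1/sqrt(N)
print(f"SST after 100 coupled steps: {sst:.2f} C")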
One of the tools that will be available for that type of work is Roadrunner, the NNSA supercomputer being set up at DOE's Los Alamos National Laboratory. As the first computer to break the petaflop barrier, it more than doubled the speed of the next-fastest computer, the IBM BlueGene/L housed at Lawrence Livermore National Laboratory. Although Roadrunner's primary mission will be to simulate the decay of nuclear materials for weapons research, scientists will also use it for climate change, astronomy, energy and human genome research.