Agencies loading up on processing power
From NOAA to NASA, teams find critical uses for high-performance systems.
Using wind tunnels to test airframe designs was a pivotal scientific breakthrough. Moving from wind tunnels to computer modeling was another leap.
But as modeling has become more complex, scientists have turned to supercomputers. Nowhere was that more evident than in NASA's return-to-flight effort after the space shuttle Columbia accident in 2003.
'In the wake of the Columbia disaster, NASA undertook a pretty substantial effort of doing a lot of computer analysis to try and understand what went wrong,' said Dave Parry, senior vice president and general manager for Silicon Graphics Inc.'s server and platform group. 'They wanted to find out exactly what the malfunction had been and whether a piece of foam breaking off could cause damage sufficient to ultimately cause the problems on re-entry.'
NASA officials tackled the problem using the Columbia supercomputer, an SGI Altix system with 10,240 processors in a shared-memory environment.
'It is as though you have a PC with tons and tons of microprocessors in it, but all sitting on a single, shared-memory system, so that you can look at the entire problem as one, holistic thing,' Parry said.
A less powerful computer would force analysts to break a model into pieces; rather than analyzing a whole plane, they could study only its rudder.
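Parry's "single, shared-memory system" is the key architectural point: every processor can read and write the same arrays directly, so a model need not be carved up across separate machines. The C/OpenMP sketch below is only an illustration of that programming style, with invented names and sizes (grid_old, grid_new, N); it is not NASA's analysis code.

```c
#include <stdio.h>
#include <omp.h>

#define N 1000000  /* hypothetical number of cells along one model dimension */

/* On a shared-memory machine, one global array is visible to every
   processor, so the whole domain can be treated as a single problem. */
static double grid_old[N], grid_new[N];

int main(void)
{
    for (long i = 0; i < N; i++)
        grid_old[i] = (i == N / 2) ? 1.0 : 0.0;  /* toy initial state */

    /* Every thread updates its share of the same global arrays.
       On a distributed-memory cluster, each node would instead hold
       only a fragment of the grid and exchange boundary cells by
       passing messages -- the "break it into pieces" approach. */
    #pragma omp parallel for
    for (long i = 1; i < N - 1; i++)
        grid_new[i] = 0.5 * (grid_old[i - 1] + grid_old[i + 1]);

    printf("step complete using up to %d threads\n", omp_get_max_threads());
    return 0;
}
```

Built with a compiler's OpenMP flag (for example, gcc -fopenmp), the same loop runs unchanged whether the machine has two processors or thousands; the hardware, not the program, handles who sees which part of memory.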
Supercomputer power also is essential to the climate research that the Commerce Department's National Oceanic and Atmospheric Administration is doing, said Walter Brooks, chief of NASA's Advanced Supercomputing Division at Ames Research Center at Moffett Field, Calif.
In the past, computer models had to focus on specific regions, such as the Caribbean. The modeling application then made assumptions about what was going on elsewhere on the planet. Sometimes it worked, sometimes it didn't.
Higher resolution
SGI's Altix system solves that problem by modeling much larger areas at a much higher resolution.
'We basically run the whole Earth simultaneously, so while we're tracking something such as Hurricane Emily, we're also seeing the typhoons that are evolving near Japan,' Brooks said.
'Sometimes the effects are important and sometimes they're not, but if you ignore them, then you don't know when these global effects drive what's going on closer to home.'
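A rough way to see the difference Brooks describes: a global grid wraps around the planet, so the model needs no assumptions at artificial edges, while a regional grid must guess what is flowing in from outside its borders. The C sketch below contrasts the two; the grid size, function names and averaging step are hypothetical stand-ins, not NOAA's model.

```c
#include <stdio.h>

#define NX 360  /* hypothetical 1-degree longitude grid around the globe */

/* Toy update on a GLOBAL (periodic) grid: the domain wraps around the
   planet, so every cell has real neighbors and nothing is assumed. */
void step_global(const double *t_old, double *t_new)
{
    for (int i = 0; i < NX; i++) {
        int left  = (i - 1 + NX) % NX;  /* wraps past the grid's seam */
        int right = (i + 1) % NX;
        t_new[i] = 0.5 * (t_old[left] + t_old[right]);
    }
}

/* The same update on a REGIONAL grid must assume values beyond its
   edges -- the guesswork older, region-only models relied on. */
void step_regional(const double *t_old, double *t_new, int nx,
                   double west_guess, double east_guess)
{
    for (int i = 0; i < nx; i++) {
        double left  = (i == 0)      ? west_guess : t_old[i - 1];
        double right = (i == nx - 1) ? east_guess : t_old[i + 1];
        t_new[i] = 0.5 * (left + right);
    }
}

int main(void)
{
    double a[NX], b[NX];
    for (int i = 0; i < NX; i++)
        a[i] = (double)i;
    step_global(a, b);
    step_regional(a, b, NX, 0.0, 0.0);  /* boundary guesses supplied by hand */
    printf("b[0] = %f\n", b[0]);
    return 0;
}
```

If the guessed boundary values are wrong, the error propagates inward with each step; running the whole globe at once removes that failure mode entirely, at the cost of far more computation.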
Using supercomputers, NOAA officials in the Geophysical Fluid Dynamics Laboratory can better study climate issues such as El Niño and global warming, said Ellis Bailey, Raytheon Co.'s program manager for the lab.
The studies are important because climate change can lead to severe storms, floods or droughts, which can have significant economic impacts. Under its contract with NOAA, Raytheon officials help define the computational requirements for computer modeling.
'Numerical models of this nature are so complex and have to look with a significant amount of detail over such large periods of time ...,' Bailey said, 'without a supercomputer, this work would be virtually impossible.'
The biggest challenge for a systems integrator working with a supercomputer is maintaining balance among all the machine's components. Raytheon can run fast models, such as one tracking a hurricane, as long as file access, processor speed and memory access stay balanced, Bailey said.
In the past, SGI built supercomputers using microprocessors it had designed and developed along with its own operating system. The new Altix systems use Intel Itanium 2 processors and the Linux operating system.
Using industry-standard components such as Intel Itanium 2 chips speeds construction of the supercomputers. SGI has built systems with as few as two processors and as many as 10,240.
NASA investigators needed that kind of power to effectively study the Columbia accident. With it, SGI's Parry said, 'they were able to ... provide an accurate model so they only had to run one physical test.' Scientists used a cannon to shoot foam at shuttle tiles to verify the model. A simulation accurately replicated what had gone wrong with the shuttle, Parry said.
Other agencies and systems integrators also are turning to supercomputers to help solve complex problems, he said. Homeland Security Department officials are using one to manage airspace. The system uses data from radar sites around the country to track aircraft. In the future, supercomputers will be common in all federal agencies, Parry said.
Doug Beizer is a staff writer for Government Computer News' sister publication Washington Technology.