Earthquake data powered by next-gen supercomputers
A collaboration among national labs aims to provide an open-access website with datasets on how ground motion affects infrastructure during earthquakes.
Agencies tasked with earthquake resilience and response will soon have access to ground-motion simulation data that depicts seismic wave propagation and projected building damage.
Officials, researchers and engineers will be able to obtain the simulation data through an open-access website in coming months, the Lawrence Berkeley National Laboratory announced Feb. 8. The earthquake models were developed using the Department of Energy’s Summit supercomputer and will be further enhanced with the Frontier exascale system.
“The purpose of our project, which we’ve been working on for [several] years is to be ready for the new generation of supercomputers that … really enable these types of calculations for the first time,” said project lead David McCallen, a senior scientist in Berkeley Lab’s Earth & Environmental Sciences Area and leader of the Critical Infrastructure Initiative.
Previous computers were limited in their “computational horsepower,” making it difficult to represent earthquake events in high fidelity, while the newest supercomputers can process much larger data loads and perform on the order of a billion billion calculations per second, he said.
But access to high-performance systems is limited, so Berkeley Lab, in collaboration with the Lawrence Livermore National Laboratory and the Pacific Earthquake Engineering Research Center, will make the models available on an open-access site where the public can download supercomputer-generated datasets on earthquake ground motion.
The new ground-motion data will help engineers better mitigate the disastrous effects of earthquakes. For instance, knowing how ground motions from a major earthquake interact with building structures is important for designing and retrofitting structures to be resilient, McCallen said. Simulations also provide “a better sense of how to plan and respond for things with distributed assets like a transportation system or an energy system across the region.”
The software used to develop the simulations, called EQSIM, uses mathematical models to represent an initial fault rupture followed by the propagation of the resulting seismic waves, McCallen said. The final step is to simulate how the waves would interact with a building, bridge or other critical infrastructure to identify potential stresses and displacements caused by the shaking.
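The last step of that pipeline can be illustrated with a toy single-degree-of-freedom building model driven by a synthetic ground motion. This is a hypothetical sketch, not EQSIM's actual code: the decaying-sinusoid ground motion, the structural parameters and the `sdof_response` function are all illustrative assumptions standing in for the lab's full-physics simulation.

```python
import math

def ground_acceleration(t, peak=3.0, freq=1.5, decay=0.4):
    """Synthetic ground acceleration (m/s^2): a decaying sinusoid standing in
    for the simulated seismic-wave arrival at one surface location."""
    return peak * math.exp(-decay * t) * math.sin(2 * math.pi * freq * t)

def sdof_response(mass=1.0e5, stiffness=4.0e6, damping_ratio=0.05,
                  duration=20.0, dt=0.005):
    """Peak displacement of a single-degree-of-freedom 'building' under the
    ground motion, integrated with the explicit central-difference scheme."""
    c = 2 * damping_ratio * math.sqrt(stiffness * mass)  # damping coefficient
    u_prev, u = 0.0, 0.0  # displacements at time steps n-1 and n
    peak = 0.0
    for n in range(int(duration / dt)):
        a_g = ground_acceleration(n * dt)
        # m*(u+1 - 2u + u-1)/dt^2 + c*(u+1 - u-1)/(2dt) + k*u = -m*a_g
        u_next = ((-mass * a_g - stiffness * u) * dt**2
                  + mass * (2 * u - u_prev)
                  + c * dt / 2 * u_prev) / (mass + c * dt / 2)
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

print(f"peak relative displacement: {sdof_response():.4f} m")
```

The peak displacement is the kind of quantity engineers compare against design limits when assessing whether a structure needs retrofitting.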
Data used to build the models comes from the U.S. Geological Survey and from structural designs and drawings of common infrastructure. EQSIM simulations also fill in data gaps where field devices cannot collect seismic measurements, McCallen said.
“We typically don’t have a lot of instruments right near the faults, which is the area we care most about, so that’s left a knowledge gap and a lack of understanding of the characteristics of strong motions,” he said. With EQSIM, “we can put grid points wherever we want and get estimates of ground motion all over.”
EQSIM and the website are slated for completion in the fall, McCallen said. Researchers also plan to expand simulation models to other regions in the state, according to the announcement.