Accurate flood mapping is still a work in progress
Agencies are still at work gathering data needed to bring the latest GPS technology to bear in creating more accurate flood maps for the United States.
The Missouri and Mississippi valleys have taken a pounding this year from heavy snow melt and spring rains, and 100-year floods seem to be happening with greater frequency.
Maps identifying areas that are likely to be inundated at least once every century – the 100-year flood plain – are used to determine flood insurance rates, regulate development and create plans to mitigate flood damage. But determining where floodwaters will go is a complex and so far inexact process.
“The flood now in Louisiana started with snow in Minnesota,” said David Doyle, chief geodetic surveyor for the National Geodetic Survey. A variation of a few inches on a map can make a difference of thousands of acres when the water completes that long journey.
NGS provides the foundational data used by the U.S. Geological Survey, the Federal Emergency Management Agency and other agencies to map the elevations and contours, from the Northern Rockies to the Gulf Coast and across the rest of the country, that determine where that water will go.
Current data used to determine elevation can vary much more than a few inches. A 2009 study by the National Research Council found that the USGS National Elevation Dataset for North Carolina varied by about 12 feet from new data produced by an aerial survey.
The technology exists to more accurately determine elevations. But using it on a large scale remains a challenge.
Although the NGS National Spatial Reference System provides the foundation for mapping, the data for compiling detailed maps increasingly comes from companies using aircraft-based systems such as LIDAR (Light Detection and Ranging), which sends pulses of light that are reflected back to sensors from structures, vegetation and the ground. Sensors can collect data from 2,000 pulses per second, said Dave Maune, a GIS project manager at Dewberry Companies LLC, a company that analyzes data and also produces maps for FEMA. The cloud of LIDAR data points is analyzed and classified to identify which returns come from the ground, which from treetops and so on.
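The geometry behind each of those pulses is straightforward, even if the processing pipeline is not. The sketch below is a simplified illustration, not Dewberry's actual software, and the function names are invented for this example: half a pulse's round-trip travel time, multiplied by the speed of light, gives the sensor-to-target distance.

```python
# A simplified illustration of LIDAR ranging (not Dewberry's software;
# names invented for this example). Half of a pulse's round-trip travel
# time, multiplied by the speed of light, gives the distance from the
# sensor to the reflecting surface.

C = 299_792_458.0  # speed of light, meters per second

def pulse_range_m(round_trip_s: float) -> float:
    """Distance from the sensor to the reflecting surface, in meters."""
    return C * round_trip_s / 2.0

def target_elevation_m(aircraft_altitude_m: float, round_trip_s: float) -> float:
    """Elevation of the reflecting point, assuming a straight-down pulse;
    real systems also correct for scan angle and sensor attitude."""
    return aircraft_altitude_m - pulse_range_m(round_trip_s)

# A return arriving ~6.67 microseconds after the pulse left an aircraft
# at 1,500 m puts the reflector roughly 1,000 m below it:
print(target_elevation_m(1500.0, 6.67e-6))  # ~500 m
```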
Ground data then undergoes what is called H&H modeling. A hydrologic modeling program analyzes runoff from predictable rainfall levels, taking into account soil types, terrain, vegetation and types of surfaces. This provides data on the amount of water coming into a stream at any point. Hydraulic modeling then analyzes what happens to that volume of water – how fast it flows, where it goes. The result is a map showing floodplains.
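As a rough illustration of the hydrologic half of that process, the sketch below uses the classic rational method (Q = CiA) to estimate peak discharge from rainfall intensity, drainage area and a runoff coefficient standing in for soil and surface type. Actual H&H studies use far more sophisticated models; this only shows how the kinds of inputs described above feed a flow estimate.

```python
# A toy illustration of the hydrologic step using the rational method,
# Q = C * i * A: with i in inches/hour and A in acres, Q comes out in
# cubic feet per second (the ~1.008 conversion factor is conventionally
# dropped). The coefficient C captures soil and surface type.

def peak_discharge_cfs(runoff_coeff: float, intensity_in_hr: float, area_acres: float) -> float:
    """Rational-method peak flow; C is ~0.9 for pavement, ~0.2 for forest."""
    return runoff_coeff * intensity_in_hr * area_acres

# The same 2 in/hr storm over the same 100 acres yields very different
# flows depending on land cover:
print(peak_discharge_cfs(0.90, 2.0, 100.0))  # paved: ~180 cfs
print(peak_discharge_cfs(0.20, 2.0, 100.0))  # forested: ~40 cfs
```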
Better modeling tools have improved the maps, Maune said. “The flood maps are more accurate than in the past.”
But in the end they are only as accurate as the elevation data they are based on. LIDAR surveys can produce accurate measurements over large areas, but they measure only the distance between an aircraft and the ground. This means that they are only as good as the positioning data for the aircraft, and that is obtained from GPS.
The Global Positioning System is one of the most significant improvements in mapping technology in more than 200 years, Doyle said. Other techniques for mapping and surveying have seen incremental improvements in tools, but “the process is still pretty much the same.” GPS has changed that, making most traditional methods for determining latitude and longitude obsolete.
But measuring height is a different story. “The vertical is a big problem,” Doyle said. “This is complex science.”
GPS is a three-dimensional tool, pinpointing locations in terms of relative position from satellites that send timing signals. Those signals can be used to establish latitude and longitude on a mathematical spheroid representing the Earth, but that spheroid has no relationship to the Earth’s gravitational field, which is needed to produce accurate elevation data.
Producing accurate elevations from GPS data requires a way to reconcile that data with another model, called the geoid, a theoretical representation of where the Earth’s sea level would be if the land were not in the way.
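Once a geoid model exists, the reconciliation itself is simple arithmetic: the orthometric height H (height above the geoid, roughly “above sea level”) is the GPS ellipsoidal height h minus the geoid undulation N at that point. A minimal sketch, with hypothetical numbers:

```python
# The hard science is in producing an accurate geoid undulation N; the
# conversion itself is just H = h - N, where h is the GPS ellipsoidal
# height and H the orthometric ("sea level") height.

def orthometric_height_m(ellipsoidal_height_m: float, geoid_undulation_m: float) -> float:
    """H = h - N, all in meters."""
    return ellipsoidal_height_m - geoid_undulation_m

# Hypothetical example: a receiver reporting h = -2.4 m on the
# ellipsoid, at a spot where the geoid sits 33.1 m below the ellipsoid
# (N = -33.1), is actually about 30.7 m above sea level:
print(orthometric_height_m(-2.4, -33.1))  # 30.7
```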
“It’s a Holy Grail for us at NGS,” Doyle said of this reconciliation.
But this means first building an accurate geoid model, and that requires detailed information about the gravitational field, which changes significantly from one point on the Earth’s surface to another. The current geoid model has errors ranging from several centimeters to as much as 2 meters in some places, Maune said.
The NGS GRAV-D program, Gravity for the Redefinition of the American Vertical Datum, is an effort to gather the data to produce a gravity-based vertical datum that would be accurate to within 2 centimeters for most of the country. “We’ve been working at it for the better part of 20 years,” Doyle said.
But current budget constraints make this a bad time for large, long-term research projects. And when finished, the new geoid model still would produce elevation results less accurate than traditional differential leveling, in which surveyors using graduated rods and a leveled sightline can achieve accuracies of less than a millimeter.
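The bookkeeping behind differential leveling amounts to accumulating backsight-minus-foresight differences from a known benchmark across many short instrument setups. The sketch below is a simplified illustration with invented readings, not surveying software:

```python
# A minimal sketch of differential leveling: starting from a benchmark
# of known elevation, each instrument setup adds a backsight reading
# (rod held on the known point) and subtracts a foresight reading (rod
# held on the new point), carrying the elevation forward.

def run_level_line(benchmark_elev: float, setups: list[tuple[float, float]]) -> float:
    """Each setup is a (backsight, foresight) pair in the same units
    as the benchmark elevation."""
    elev = benchmark_elev
    for backsight, foresight in setups:
        elev += backsight - foresight
    return elev

# Three hypothetical setups carrying a 100.000 ft benchmark uphill:
print(run_level_line(100.000, [(4.312, 2.105), (5.871, 1.442), (3.006, 2.990)]))
# 106.652
```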
Traditional surveying can be done only on a small scale, however, usually over sightlines of 150 feet or less, making it impractical for large-scale mapping. To make local measurements relevant on a regional or national scale, some baseline is needed to tie an elevation in Maryland to one in Iowa or California. A height of “10 feet” means 10 feet above what?
Elevation usually is expressed in terms of mean sea level, but the Earth has no single sea level. It varies constantly from time to time and place to place, so most height systems are local, regional or at best national. New York City uses at least 12 different height standards, Doyle said. The North American standard for mean sea level is based on a tide gauge at the mouth of the St. Lawrence River. That standard is arbitrary, however, and has little to do with actual sea level at New York, Baltimore or San Diego (not to mention Topeka and Indianapolis, which have not seen a sea for millions of years).
Figuring mean sea level at any one point is an arduous task, because it varies according to the time of day, the season, the weather and the alignment of the Earth, moon and sun, which goes through an 18.6-year cycle. This means that accurately determining mean sea level at any point requires nearly 19 years’ worth of data.
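A minimal sketch of that requirement, with hypothetical gauge data: mean sea level at a station is just the average of its water-level readings, but the average is only considered stable once the record spans the full 18.6-year lunar nodal cycle, which NOAA’s National Tidal Datum Epoch rounds to 19 years.

```python
# Mean sea level at a gauge is the average of its readings, but a
# trustworthy average must cover the full 18.6-year cycle of the
# Earth-moon-sun alignment. The readings here are hypothetical.

from statistics import mean

NODAL_CYCLE_YEARS = 18.6  # lunar nodal cycle driving long-period tides

def mean_sea_level_m(readings_m: list[float], record_years: float) -> float:
    """Average water level, refusing records too short to smooth out
    the long-period tidal cycle."""
    if record_years < NODAL_CYCLE_YEARS:
        raise ValueError(
            f"{record_years}-year record is shorter than the "
            f"{NODAL_CYCLE_YEARS}-year nodal cycle"
        )
    return mean(readings_m)
```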
With so many local standards for elevation now in use, coming up with a global standard is “something for the future,” Doyle said. There is hope that some agreement could be reached in 10 years or so, but getting everyone to move from current measurements could be a tough sell.
In the meantime, producing the floodplain maps used by insurance companies, emergency planners and civic planners remains a challenge. FEMA and the National Oceanic and Atmospheric Administration (which includes the NGS) commissioned the 2009 National Research Council study on the costs and benefits of updating the mapping system. FEMA’s map modernization program used data compiled from 2003 through 2008 to produce digital flood maps for 92 percent of the continental U.S. population, but the council reported that as of 2009 only 21 percent of the population had maps that met all of FEMA’s data quality standards.