Fixing the flaws in North American maps
Twentieth-century mapping techniques have not kept pace with 21st-century geospatial technology, and the National Geodetic Survey has begun a 10-year program to update the National Spatial Reference System that the nation’s mapping and surveying are built on.
Advances in geospatial technology have, in a relatively short time, outpaced the standard terrestrial mapping systems that defined the world’s geography for centuries. The National Geodetic Survey is putting that new technology to use, embarking on a 10-year program to rewrite the map of North America.
The National Spatial Reference System, which is the basis for most mapping and surveying in the United States, primarily consists of 1.5 million passive markers installed by NGS during the past 200 years. The markers provide reference points for surveyors, but the accuracy of their positioning varies, depending on the state of the art at the time they were set. The system, last updated in the 1980s, contains errors of as much as 2 meters in latitude and longitude and about 1 meter in elevation.
State-of-the-art global navigation satellite system (GNSS) technology can place a position to within a centimeter with a few hours of data collection, and that process is expected to speed up significantly during the next 10 years.
“The future of positioning is GNSS,” states a recent NGS white paper, titled “Improving the National Spatial Reference System.”
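To see why longer observation sessions pay off, consider a minimal sketch below; it is a toy statistical model, not NGS' actual processing, which relies on carrier-phase techniques. Averaging many noisy one-second fixes shrinks the expected error roughly in proportion to the square root of the number of observations, which is why hours of collection can turn meter-level noise into centimeter-level results.

```python
import random
import statistics

# Toy model of a static GNSS occupation: average n independent
# one-second fixes, each with meter-level noise. Survey-grade
# processing uses carrier-phase observations rather than simple
# averaging, but the statistical intuition -- error shrinking
# roughly as 1/sqrt(n) -- is the same.
def simulate_occupation(true_pos_m=100.0, sigma_m=2.0, n_fixes=3600):
    fixes = [random.gauss(true_pos_m, sigma_m) for _ in range(n_fixes)]
    estimate = statistics.fmean(fixes)
    expected_error_m = sigma_m / n_fixes ** 0.5
    return estimate, expected_error_m

for hours in (0.1, 1.0, 4.0):
    n = int(hours * 3600)
    est, err = simulate_occupation(n_fixes=n)
    print(f"{hours:4.1f} h ({n:6d} fixes): estimate {est:9.3f} m, "
          f"expected error ~{err * 100:.1f} cm")
```

With the illustrative 2-meter noise figure used here, a four-hour session lands near 1.7 centimeters of expected error, consistent with the centimeter-level results the white paper describes.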
Better accuracy can have a significant impact. A one-meter error in elevation in a coastal region such as Louisiana could mean the difference between a hurricane evacuation route being passable or underwater, and it could result in water flowing the wrong way when dikes and channels are designed.
Satellite positioning technology soon to be available to consumers will be accurate enough to fall out of alignment with existing, less accurate maps, said NGS Chief Geodesist Dru Smith. That could, for example, result in a car navigation system directing a driver into the wrong lane of traffic.
To address those issues, the National Oceanic and Atmospheric Administration, NGS' parent agency, convened the first federal geospatial summit in May.
“No longer are passive marks the way of doing business,” Smith said. “The geodetic control of the future is the orbits of the satellites in the sky.”
Those orbits, coupled with increasingly precise measurements of Earth’s gravity field and computing tools for massaging data, can provide more accurate positioning than what is now available from the National Spatial Reference System.
NGS already uses satellite positioning data from the Global Positioning System to provide survey-grade positioning that is more accurate than the standard used for most mapping. The goal of the modernization program is to bring the entire system into the 21st century by aligning the National Spatial Reference System with global satellite systems, drawing on constantly updated satellite data combined with increasingly accurate models of the Earth and its gravity field.
“We have the technology in place to do it,” Smith said. “We need to bring the world along.”
That will not necessarily be simple. The change will render obsolete millions of existing maps, charts and other records that federal, state and local governments now depend on. So NGS is moving slowly, phasing in the new standards during the next 10 years before abandoning the existing system.
Advances in mapping, positioning and other geospatial technology have resulted in a proliferation of sensors and systems that provide information, much of it in digital formats. A lot of geospatial data exists today only in digital form. Although a good deal of it soon might be obsolete, it is valuable for comparative uses, and the Library of Congress is engaged in an effort to help preserve that data.
Whether in maps, aerial imaging or other forms, the data is used to track changes in the Earth’s geography, structures, land use and environment, said William Lefurgy, digital initiatives project manager at the library’s National Digital Information Infrastructure and Preservation Program.
“There is so much of it cranked out every day that managing it is becoming a problem,” Lefurgy said. “There is a growing awareness that you not only need to preserve what you have now but also the material you had yesterday.”
Because the data is being produced in a variety of digital formats, the library and Columbia University are creating a Web-based clearinghouse for sharing best practices in preserving digital data.
World View
Geodesy is the science of the Earth’s size and shape, its gravity and its magnetic fields. At the National Geodetic Survey, “we are concerned with precisely positioning things on the Earth,” Smith said.
“Precisely” is a relative term because of errors that creep into measurements and the constantly changing surface of the Earth. The land shifts, and elevations change as ground rises and subsides. Sea level, the traditional benchmark for measuring elevation, is not constant: it changes from hour to hour and place to place, and, more significantly, it is rising globally.
The existing U.S. geodetic baselines — the references from which measurements are made — are the North American Datum of 1983 for latitude and longitude and the North American Vertical Datum of 1988 for elevation. Those were updated from the previous standards that had been established in the 1920s.
“It was quite good for the era,” Smith said of the 1920s references. But “the technology was pretty primitive by modern standards.” By the 1960s, techniques for electronic distance measurement and early satellite measurements were advancing the state of the art, and by the 1980s, the data had to be updated to correct errors of as much as hundreds of meters in some positions.
That occurred “just in time to be obsolete when GPS went up,” Smith said.
Although the baselines in use are a significant improvement over those of the 1920s and are good enough for most uses today, the submeter accuracy that soon will be available on handheld consumer positioning devices will result in “glaring errors to general users,” according to the NGS white paper. Moreover, a system of static reference points, no matter how accurately positioned, will not address changes in the Earth’s surface over time.
“Decisions made based on marks set in a subsiding crust may yield unintentional harm to life or property,” the NGS white paper states. “For example, decisions about building homes in flood-prone areas or declaring roads to be high enough to serve as evacuation routes must be based on accurate heights or the results can be devastating.” Areas of largest concern today are low-lying areas, including the Gulf of Mexico, Chesapeake Bay and California agricultural regions.
Global navigation satellites orbit around the center of the Earth’s mass, which now has been located to within less than a centimeter. Those precisely measured orbits enable this generation of GPS devices for general consumer use to be accurate to within a few meters instantaneously. But NGS can provide more accurate measurements by massaging the data and comparing it with measurements from a nationwide network of about 1,400 precisely positioned and continuously operating GPS receivers.
NOAA operates about 5 percent of the permanent receivers in this network, called Continuously Operating Reference Stations. The rest are operated by universities, state transportation departments and other organizations. Surveyors who need accurate positioning send their GPS data to NGS’ Online Positioning User Service, and OPUS computes the position based on its relationship to the permanent CORS receivers.
“What we do is massage the data using precise orbits, clock corrections and accounting for other phenomena like atmospheric conditions” and use it with terrestrial measurements of the gravity field, NGS’ Smith said.
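The principle behind that correction can be sketched in a few lines: errors from orbits, clocks and the atmosphere are largely common to a CORS station and a nearby rover, so subtracting the reference station's observed-minus-known offset removes most of them. The station name and numbers below are invented for illustration; OPUS itself performs far more sophisticated carrier-phase processing.

```python
REF_KNOWN_M = 1000.000  # surveyed coordinate of a hypothetical CORS station (m)

def differential_fix(ref_observed_m: float, rover_observed_m: float) -> float:
    """Remove the error common to both receivers, using the
    reference station's precisely known coordinate."""
    common_error_m = ref_observed_m - REF_KNOWN_M  # shared orbit/clock/atmosphere bias
    return rover_observed_m - common_error_m

# A 1.8 m bias affecting both receivers cancels almost entirely:
print(differential_fix(ref_observed_m=1001.8, rover_observed_m=2051.8))  # 2050.0
```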
Height itself is defined in terms of gravity: measuring the height of a point requires knowing the strength of gravity there, which depends on the distance from the center of the Earth and on the distribution of the Earth’s mass, especially near the point in question. Precise measurements of the gravity field therefore help determine the height of a surveyed point. The human body cannot detect such small variations in gravity, but flood patterns are affected by them.
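A minimal sketch of the arithmetic involved, with illustrative numbers: GNSS delivers height above a mathematical reference ellipsoid, and a gravity-derived geoid model converts it into the kind of height that actually governs where water flows.

```python
def orthometric_height(ellipsoidal_h_m: float, geoid_undulation_m: float) -> float:
    """H = h - N: convert a GNSS height above the reference ellipsoid (h)
    into a height above the geoid (H), the gravity-defined level surface
    that determines how water flows. N comes from a gravity-field model."""
    return ellipsoidal_h_m - geoid_undulation_m

# Along the Gulf Coast the geoid sits roughly 26 m below the ellipsoid
# (an illustrative figure), so a point with a *negative* GNSS height
# can still stand several meters above local sea level:
print(orthometric_height(ellipsoidal_h_m=-20.0, geoid_undulation_m=-26.0))  # 6.0
```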
Final Measures
The modernized system will replace the existing network of static markers, taking advantage of the strengths of global satellite systems and NGS’ own expertise in modeling the gravity field. However, many of the gravity measurements across the country are out of date and in need of a consistent, coordinated resurvey. NGS has initiated the Gravity for the Redefinition of the American Vertical Datum project to resurvey the gravity field; it is expected to take about 10 years and cost $40 million before its data is ready to update the National Spatial Reference System.
In contrast, merely resurveying the existing benchmarks with the same techniques used in the 1980s would cost an estimated $200 million and would not solve any of the problems that come with a system of passive marks. NOAA has estimated that the modernization program could produce benefits of $4.8 billion over 15 years, including $2.2 billion in savings from improved floodplain management.
The federal, state and local planners who use this information need to be able to manage and access it over time so that changes can be tracked, but the rapid evolution of the technology for gathering and recording geospatial data makes that difficult. The preservation clearinghouse that the Library of Congress and Columbia University are establishing will be a source for best practices already in use for maintaining data.
“The data includes both current and legacy information about geography and structures, land use and environmental measurements,” LOC’s Lefurgy said. “Quite a bit of it is photographic material.”
Because there are many different types of data, there are silos of expertise in managing it. At a meeting on preservation of geospatial data hosted by LOC last year, users complained that there was no easy way to take advantage of that expertise.
“Everyone agreed that there is a lot they could learn from each other,” Lefurgy said.
The clearinghouse will not seek to reinvent the wheel but will focus on sharing existing best practices to enable curators to take best advantage of the state of the art.
“There probably still is a lot of work to be done in developing best practices,” Lefurgy said. “A lot already has been done, but nobody is going to say they have [all] the answers.”
The new geospatial standards that NGS is putting in place are accurate enough that this could well be the last time they have to be updated, Smith said. Additional precision undoubtedly could be squeezed out of present measurements, but with the center of the Earth’s mass located to within less than a centimeter and the ability to accurately model the gravity field and satellite orbits, it is unlikely that any more order-of-magnitude changes will be made, he said.