Their goal is to post spatial data for Web and GILS users
Doug Nebert and his staff at the U.S. Geological Survey want to
make it easier to serve up spatial data on the World Wide Web.
With agencies up to their elbows in Web construction projects, Nebert said, the ones
that collect spatial data are wondering when they'll find time to provide Z39.50 servers,
too.
An executive order requires them to document any new geographic data sets they produce
after January 1995, following the Content Standards for Digital Geospatial Metadata
adopted by the Federal Geographic Data Committee (FGDC) [see story, Page 47]. The
executive order also requires posting descriptions of the new data sets to an on-line
clearinghouse so the public will know the data exists, how complete and accurate it is,
and how to get a copy.
Nebert, former chief of the USGS Water Resources Division's Spatial Data Support Unit,
is responsible for designing and building the spatial data servers for the FGDC.
By early summer, he expects to post setup instructions
and a selection of software at a Web clearinghouse site so agencies can build their own
servers.
Accessing spatial data on the Web will require both browsing capability and Z39.50
search protocol software; neither is sufficient alone. "Search and browse are very
complementary features, and the Web is biased toward browse," Nebert said.
In the clearinghouse environment, spatial data must conform to the application profile
known as GEO, which incorporates the FGDC's metadata standards. GEO is a specialized
application of the Government Information Locator Service (GILS), the government's
Z39.50-based catalog service. All agencies are supposed to have created GILS catalogs
of their data for public access.
One of GEO's metadata fields will be a uniform resource locator (URL) pointing to the
spatial data set itself, which means users could simply click on that URL field to
download the data or an order form.
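Under the FGDC content standard, that link travels inside the record's citation. A
fragment of such a record might read as follows; the title and address here are
invented for illustration:

    Identification_Information:
      Citation:
        Citation_Information:
          Title: Watershed boundaries, hypothetical river basin
          Online_Linkage: http://www.example.gov/data/basin.tar.gz

A clearinghouse client that recognizes the Online_Linkage element could render it as a
clickable link.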
Nebert and his staff are starting to develop a map interface with applets written in
Sun Microsystems Inc.'s Java language. "We're waiting for browser technology to catch
up a little bit and investigating what Java can do," he said. For example, users might
click on a map in several places to define a polygon around a search area such as a river
basin.
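As a rough sketch of the interaction Nebert describes, and not software from the
clearinghouse project, a Java applet could collect those clicks into a polygon like
this (the class name is invented):

    import java.applet.Applet;
    import java.awt.Event;
    import java.awt.Graphics;
    import java.awt.Polygon;

    // Hypothetical sketch: each mouse click adds a vertex to a polygon
    // outlining the user's search area, such as a river basin.
    public class SearchAreaApplet extends Applet {
        private Polygon searchArea = new Polygon();

        // Java 1.0 event model, current when this article appeared
        public boolean mouseDown(Event e, int x, int y) {
            searchArea.addPoint(x, y);
            repaint();
            return true;
        }

        public void paint(Graphics g) {
            if (searchArea.npoints > 1)
                g.drawPolygon(searchArea); // outline is drawn closed
        }
    }

The polygon's vertices would then go to the server as the spatial constraint on a
search.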
"What we're doing is really bootstrap," Nebert said. "We'll try to ride
on other protocols and impose as small a burden as possible on what people already are
doing."
He said the National Geospatial Data Clearinghouse will have a distributed architecture
to ensure accuracy, because agencies that own the spatial data also should maintain the
metadata that describes it. "If people just send us the metadata," he said,
"they may never remember to send us updates as they occur."
Nebert also plans to provide sample setups for translating Z39.50 queries into queries
that can be understood by agencies' own database server software, whether in relational,
flat-file, Postgres or other formats.
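To make the idea concrete, here is a hypothetical fragment of such a translation layer
for a relational back end; the attribute numbers, table and column names are invented
for illustration and are not taken from the GEO profile:

    import java.util.Hashtable;

    public class GeoQueryGateway {
        // Map Z39.50 Use-attribute numbers onto local metadata columns
        // (assumed values for illustration).
        private static Hashtable attributeToColumn = new Hashtable();
        static {
            attributeToColumn.put(new Integer(4), "title");
            attributeToColumn.put(new Integer(31), "pub_date");
        }

        // Turn one Z39.50 term search into SQL. A production gateway
        // would escape the term and cover the profile's full attribute set.
        public static String toSql(int useAttribute, String term) {
            String column = (String) attributeToColumn.get(new Integer(useAttribute));
            if (column == null)
                throw new IllegalArgumentException("unmapped attribute " + useAttribute);
            return "SELECT * FROM metadata WHERE " + column +
                   " LIKE '%" + term + "%'";
        }
    }

A flat-file or Postgres back end would swap in its own query syntax behind the same
interface.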
"You also need to think about how you'll package your data and how large it
is," Nebert said. He advised managing data by logical geographic areas so that no
data set is larger than 3M.
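Nebert doesn't spell out the reasoning, but the arithmetic of the day supports the
ceiling: 3 megabytes is 24 megabits, and over a 28.8-kilobit/sec modem connection that
works out to 24,000 / 28.8, roughly 833 seconds, or about 14 minutes of downloading.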
More information is available at the FGDC clearinghouse home page at http://fgdc.er.usgs.gov.