Laser-based mapping tech a boost for troops in Afghanistan
LIDAR, a remote sensing technology that bounces pulsed laser light off terrain, produces more accurate maps more quickly than traditional geospatial tools.
Light detection and ranging technology, which has proved invaluable to a variety of domestic mapping projects, is also a potential game-changer in Afghanistan, where troops face some of the world’s most forbidding terrain.
Troop movements are hair-raising; improvised explosive device attacks have been ferocious enough that broad agency announcements go out seeking unmanned ground vehicles. Helicopter gunship crews remain vulnerable to shoulder-launched rockets and anti-aircraft fire. Yet until recently, coalition commanders often relied on dusty British terrain maps dating to the Raj.
But LIDAR, an optical remote sensing technology, can yield data flows that are orders of magnitude quicker, more accurate and clearer than those of other mapping tools.
Related stories:
Accurate flood mapping is still a work in progress
NASA data lets scientists map forests (and the trees)
LIDAR bounces pulsed lasers, rather than the radio waves of radar or the sound waves of sonar, off target objects and surrounding areas of interest to detect their properties. Because it can sense atmospheric particles far smaller than the cloud droplets that mark radar's limit, LIDAR can even be used for aerosol detection.
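The underlying principle is a time-of-flight measurement. A minimal sketch, assuming a simple single-return model (the 6.67-microsecond figure is a hypothetical illustration, not a quoted value): the sensor times each pulse's round trip and converts it to a distance.

```python
# Minimal time-of-flight sketch: a LIDAR return's range is half the
# round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light, m/s

def pulse_range(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return C * round_trip_seconds / 2.0

# A hypothetical return arriving 6.67 microseconds after emission
# corresponds to a surface roughly 1 kilometer from the aircraft.
print(f"{pulse_range(6.67e-6):.1f} m")  # ~999.8 m
```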
Some laser radar systems can perform multiple sequential scans over a scene. Others create images in targeting or mapping modes. In targeting mode, the sensor stays focused on one spot, saturating a target with laser pulses to generate a high-resolution product. Mapping mode is wide-area collection, in which the laser pans to gather data along a set path.
The National Geospatial-Intelligence Agency (NGA) has been deploying LIDAR in aircraft to map Afghanistan’s entire 647,500 square kilometers. In announcing the ALIRT LIDAR project at the GEOINT 2010 Symposium, Air Force Lt. Gen. John Koziol, director of the Defense Department’s Intelligence, Surveillance and Reconnaissance (ISR) Task Force, heralded the technology’s “amazing capacities” for coverage down to inch-level fidelity.
At that time, a sole Air Force G-3 Gulfstream was involved in NGA's Afghanistan mapping operation, Koziol said. The total number of aircraft ultimately deployed — and nearly the entire program — is classified. But NGA describes its ALIRT LIDAR program as “an airborne 3-D imaging laser radar system optimized for wide-area terrain mapping.” LIDAR mapping was also used in the aftermath of Haiti's 2010 earthquake.
In public testimony in March, Defense Advanced Research Projects Agency Director Regina Dugan called LIDAR’s photon-detecting arrays “so sensitive it’s now possible to make range measurements with fewer than 10 photons received, versus tens of thousands,” which at times allows the technology to locate hidden objects and penetrate tree canopies.
LIDAR can complement, interface with or be fused with conventional maps, tactical computer modeling, full-motion video, hyperspectral imagery and ISR platforms. The Global Positioning System supplies the precise positioning that ties those sources together.
Mathias Kolsch, assistant professor of computer science at the Naval Postgraduate School, helps direct the DOD-backed ISR Task Force at the school’s Remote Sensing Center. Often in consultation with National Security Agency colleagues, he works on LIDAR issues such as anomaly-detection and semantic compression algorithms, computer-vision analysis, and multimedia.
The depth calculation capability of the Afghanistan-mapping technology is a prime attraction for the deployed force, Kolsch said. Mission tasking determines the lasers’ beam widths and other parameters, but depth can typically be resolved to within a few centimeters to a meter.
“The most advanced LIDAR sensors for critical depth measurements are called Flash LIDAR, and they do more or less what a flashlight does: send out a big broad pulse of laser light…[with] multiple receivers” for the returns, he said. “Instead of the traditional LIDAR with its single receptor, [these sensors] have a whole detection array — banks of digital [charge-coupled device] cameras — so they simultaneously get multiple depths.”
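A rough illustration of that idea (a simplified, hypothetical model rather than any fielded sensor's interface): because every element of the detector array records its own return time, a single flash yields an entire grid of depths at once.

```python
import numpy as np

# Hypothetical flash-LIDAR model: one broad pulse illuminates the scene,
# and each pixel of the detector array records its own round-trip time.
C = 299_792_458.0  # speed of light, m/s

def depth_image(return_times_s: np.ndarray) -> np.ndarray:
    """Convert an array of per-pixel round-trip times into depths (meters)."""
    return C * return_times_s / 2.0

# A 4x4 patch of simulated return times near 6.67 microseconds
# (about 1 km away) becomes a 4x4 depth map from a single flash.
times = 6.67e-6 + np.random.normal(0.0, 1e-9, size=(4, 4))
print(depth_image(times).round(2))
```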
Current-generation LIDAR can be configured to cover specific structures, vehicles, roadways or terrain. The light it pulses from an aircraft’s sensor can, by some estimates, collect as many as 150,000 data points per second. Each return carries an X, Y and Z location, and the accumulated returns, referred to as a point cloud, can contain millions of individual ground data points.
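At that rate the volumes add up fast. A back-of-the-envelope sketch (the one-minute pass and the record layout are illustrative assumptions, not figures from the program):

```python
from dataclasses import dataclass

# Hypothetical point-cloud record: each LIDAR return is stored as an
# X-Y-Z position plus the strength of the reflected pulse.
@dataclass
class LidarPoint:
    x: float          # easting, meters
    y: float          # northing, meters
    z: float          # elevation, meters
    intensity: float  # reflected-pulse strength

POINTS_PER_SECOND = 150_000      # collection rate cited in the article
pass_seconds = 60                # assumed one-minute collection pass
print(POINTS_PER_SECOND * pass_seconds)  # 9,000,000 points from a single pass
```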
According to the Naval Postgraduate School, LIDAR wavelengths range from ultraviolet through visible light to infrared, at least 1,000 times shorter than radar wavelengths. By comparison, Canada’s Radarsat satellite operates at a wavelength of 5.6 cm. Here, smaller is better.
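To put rough numbers on that comparison (the 1,064-nanometer figure is an assumed, typical near-infrared LIDAR wavelength; the article does not specify one):

```python
# Ratio of Radarsat's 5.6 cm radar wavelength to an assumed 1,064 nm
# near-infrared LIDAR wavelength (a common choice, not quoted in the article).
radar_wavelength_m = 5.6e-2
lidar_wavelength_m = 1.064e-6
print(round(radar_wavelength_m / lidar_wavelength_m))  # roughly 53,000 times shorter
```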
What’s more, satellite imagery is typically accurate only to within a few meters, Kolsch said. But “with LIDAR-based approaches, you can identify little pebbles on a road — and what’s around them.” Besides revealing terrain conditions and elevations from 3-D models, the data can be used to detect change over time and determine whether new buildings have been constructed or roads regraded.
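A bare-bones sketch of that change-detection idea (simplified and hypothetical, not NGA's actual workflow): difference two elevation grids of the same area collected at different times and flag cells that moved more than a threshold.

```python
import numpy as np

# Hypothetical change detection: compare two gridded elevation surfaces
# of the same area collected on different dates.
def changed_cells(before: np.ndarray, after: np.ndarray,
                  threshold_m: float = 2.0) -> np.ndarray:
    """Boolean mask of cells whose elevation changed by more than the threshold."""
    return np.abs(after - before) > threshold_m

before = np.zeros((100, 100))   # flat ground in the first collection
after = before.copy()
after[40:50, 40:50] += 6.0      # a new structure appears by the second collection
print(changed_cells(before, after).sum())  # 100 cells flagged as changed
```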
Other mapping modalities being deployed or in late-stage development include the DARPA/Air Force Research Lab’s High Altitude LIDAR Operations Experiment, the Army’s holographic Tactical Battlefield Visualization program, and the Army Geospatial Center’s terrain-charting BuckEye tool, which melds airborne LIDAR with digital color camera imagery.
An NGA imagery scientist, who asked to remain anonymous for security reasons, said that depending on the mission, LIDAR sensors are “bathymetric, topographic and atmospheric…and gather topographic data using different regions of the spectrum.” The resulting data is used to automatically generate high-resolution 3-D digital terrain and elevation models. Overall, the scientist said, LIDAR elevation data supports improved battlefield visualization, line-of-sight analysis and urban warfare planning.
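As a rough idea of how scattered returns become a terrain model (a simplified, hypothetical gridding step, not NGA's production pipeline): bin each return into a ground cell and keep the lowest elevation per cell as an approximation of bare ground.

```python
import numpy as np

# Hypothetical gridding step: turn scattered X-Y-Z returns into a coarse
# digital elevation model by keeping the lowest Z value in each 1 m cell.
def grid_to_dem(points: np.ndarray, cell_size: float = 1.0) -> np.ndarray:
    """points is an (N, 3) array of x, y, z; returns a 2-D elevation grid."""
    cols = (points[:, 0] // cell_size).astype(int)
    rows = (points[:, 1] // cell_size).astype(int)
    dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, z in zip(rows, cols, points[:, 2]):
        if np.isnan(dem[r, c]) or z < dem[r, c]:
            dem[r, c] = z  # the lowest return approximates the ground surface
    return dem

pts = np.array([[0.2, 0.3, 101.5], [0.7, 0.4, 100.9], [1.5, 0.2, 102.3]])
print(grid_to_dem(pts))  # [[100.9 102.3]]
```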
Some research focuses on using LIDAR to perform fully automated feature extraction, 3-D urban modeling and vertical obstruction analysis, the scientist said. LIDAR might eventually perform semi-automated feature extraction and display — for example, of vegetation, buildings and roads.