Game generates training data for supercomputer mapping coral reefs
Researchers at NASA’s Ames Research Center are combining remote sensing data with an online game to produce training data for a machine-learning algorithm running on a supercomputer.
To get a better idea of the changes affecting coastal coral reefs, researchers at NASA’s Ames Research Center are combining remote sensing data, satellite imagery, supercomputer-powered machine learning and crowdsourced gamification. The NeMO-Net project aims to improve scientists’ ability to remotely sense and analyze changes in underwater ecosystems caused by rising ocean temperatures, pollution and ocean acidification.
With funding from NASA’s Earth Science Technology Office, Ved Chirayath, a scientist at NASA’s Ames Research Center, first developed the FluidCam and fluid lensing software to collect video through the ocean’s surface. The FluidCam is a high-performance digital camera with 16 cores of processing power and about a terabyte of memory mounted on an orbiting CubeSat. It processes megabytes of imaging data per second, Chirayath said in a video interview.
The fluid lensing software on the camera removes the refraction and distortion that sunlight and surface waves introduce into images of the sea floor, allowing scientists to see objects at the centimeter scale 10 meters below the surface.
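NASA has not published the fluid lensing code itself, but the general idea -- combining many co-registered video frames so that the time-varying distortion from passing waves cancels out while static sea-floor features remain -- can be sketched in a few lines of Python. The function name and the simple per-pixel median below are illustrative assumptions, not NASA’s algorithm, which additionally exploits wave crests as magnifying lenses to boost resolution.

    import numpy as np

    def suppress_wave_distortion(frames: np.ndarray) -> np.ndarray:
        """Collapse co-registered sea-floor video frames into one image.

        frames -- array of shape (num_frames, height, width); each frame
        shows the same patch of sea floor, distorted differently by the
        waves passing overhead at the moment of capture.
        """
        # The distortion moves with the waves, so over enough frames each
        # pixel's median converges on the undistorted bottom feature.
        return np.median(frames, axis=0)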
Multispectral Imaging, Detection and Active Reflectance (MiDAR), meanwhile, provides real-time video built from measurements of an object’s reflectance. MiDAR uses an array of LED emitters on drones or autonomous underwater vehicles to illuminate targets far below the surface and to measure their spectral reflectance, which is then combined with the FluidCam video to produce 3D multispectral scenes and high-resolution underwater imagery.
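Because MiDAR supplies its own calibrated illumination, reflectance can in principle be recovered by subtracting the ambient signal and dividing by the known LED output in each band. The sketch below illustrates that ratio; the array shapes, the dark-frame calibration step and the function name are assumptions for illustration, not MiDAR’s published processing chain.

    import numpy as np

    def spectral_reflectance(received: np.ndarray, emitted: np.ndarray,
                             dark: np.ndarray) -> np.ndarray:
        """Estimate per-band reflectance of an actively illuminated target.

        received -- (bands, h, w) sensor readings with the LED array on
        dark     -- (bands, h, w) readings with the LEDs off (ambient only)
        emitted  -- (bands,) calibrated LED output per spectral band
        """
        signal = received - dark                 # remove ambient light
        return signal / emitted[:, None, None]   # ratio against emitted power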
The final piece of the puzzle is NeMO-Net, the Neural Multi-Modal Observation and Training Network for Global Coral Reef Assessment. The machine-learning technology exploits high-resolution data from FluidCam and MiDAR to improve low-resolution sensing data from aircraft and satellites, Chirayath explained in a paper on the technologies.
NeMO-Net also features an interactive video game by the same name that challenges players to identify and classify images of coral. Players travel the ocean in their own research vessel, the Nautilus, and colorize 3D and 2D images of the ocean floor.
Interactive tutorials train players on domain-specific knowledge, and the game periodically checks their labeling against pre-classified coral imagery to improve their classification skills, a paper on the project explained. Players can rate other players’ classifications and unlock rewards as they label items in shallow marine environments.
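The paper does not spell out the scoring formula, but a common pattern in crowdsourced labeling -- and a plausible reading of the accuracy checks and peer ratings described above -- is to grade each player against the pre-classified control images and weight their votes accordingly. A hypothetical sketch:

    from collections import Counter

    def player_accuracy(labels: dict, answer_key: dict) -> float:
        """Fraction of a player's labels on pre-classified control images
        that match the known answer (hypothetical scoring scheme)."""
        checked = [img for img in labels if img in answer_key]
        if not checked:
            return 0.5  # no evidence yet; assume chance-level trust
        return sum(labels[i] == answer_key[i] for i in checked) / len(checked)

    def consensus_label(votes):
        """Weighted vote over (label, accuracy_weight) pairs for one image."""
        tally = Counter()
        for label, weight in votes:
            tally[label] += weight
        return tally.most_common(1)[0][0]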
As they play the game, players’ actions help train NASA’s Pleiades supercomputer to recognize corals, even in images taken with instruments less powerful than FluidCam and MiDAR. The supercomputer uses machine learning to abstract knowledge from the coral classifications players make by hand so that it can eventually classify images on its own, NASA officials said.
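The project’s paper describes NeMO-Net as a convolutional neural network that learns benthic classes from labeled image patches. A stripped-down sketch of that kind of classifier appears below; the layer sizes, band count and class count are illustrative placeholders, not the network NASA actually trains on Pleiades.

    import torch.nn as nn

    NUM_CLASSES = 10  # assumed number of coral/benthic cover classes

    class CoralPatchClassifier(nn.Module):
        """Toy convolutional classifier for multispectral coral patches."""
        def __init__(self, bands: int = 4):  # band count is an assumption
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(bands, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),  # one feature vector per patch
            )
            self.head = nn.Linear(64, NUM_CLASSES)

        def forward(self, x):  # x: (batch, bands, height, width)
            return self.head(self.features(x).flatten(1))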
As more people play NeMO-Net, Pleiades' mapping abilities will improve until it can classify corals from low-resolution data, which in turn will allow scientists to more readily see what is happening to coral reefs and find ways to preserve them.
"NeMO-Net leverages the most powerful force on this planet: not a fancy camera or a supercomputer, but people," Chirayath said. "Anyone, even a first grader, can play this game and sort through these data to help us map one of the most beautiful forms of life we know of."
In fact, NeMO-Net was tested by fourth-graders studying coral reef ecosystems at the Town School for Boys in San Francisco. The students identified several bugs and offered valuable feedback about ease of use and replayability.
NeMO-Net is available for iOS devices and Mac computers in the Apple App Store, and a version for Android devices will be released soon.