Internaut | Managing a flood of new data
Government information technology managers are constantly inundated with new types of data arriving from an ever-increasing number of sources. It's their job to figure out what's worth keeping from each data stream, how to store it, how to access it and how to make the data available to a wide variety of applications.
One type of information already having an impact is data generated by various civil engineering projects. That includes information from road and bridge sensors, water level sensors, smart lighting controls for buildings or public spaces, and even citywide networks of traffic controls, highway signs and monitors along fences and borders.
To understand that growing data flow and its associated issues, let's start with the sensors themselves. Many of them are transducers, which measure energy produced by pressure, movement or heat and convert it into something else, such as an electrical impulse that can be recorded as data. New types of sensors implanted in bridges can measure the movement of girders or plates, metal corrosion, and other types of wear. A local system usually collects the sensor impulses and converts them into a specific type of data that can then be sent to a computer.
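As a rough illustration of that last step, here is a minimal Python sketch of a collection loop that polls a transducer and converts each raw reading into a timestamped record. The read_raw_voltage() function and the calibration constant are hypothetical stand-ins for a real sensor driver, not part of any particular system.

```python
import time
from datetime import datetime, timezone

# Hypothetical calibration: volts-to-microstrain scale for one strain gauge.
VOLTS_TO_MICROSTRAIN = 412.0  # assumed value, for illustration only

def read_raw_voltage():
    """Stand-in for a driver call that samples the transducer's output."""
    return 0.0123  # placeholder reading

def collect_reading(sensor_id):
    """Convert one raw electrical impulse into a structured data record."""
    volts = read_raw_voltage()
    return {
        "sensor_id": sensor_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "microstrain": volts * VOLTS_TO_MICROSTRAIN,
    }

if __name__ == "__main__":
    # Poll once per second and hand each record to the upstream system.
    for _ in range(5):
        print(collect_reading("girder-07"))
        time.sleep(1)
```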
Those systems produce a variety of data types, many of which are proprietary. But a standard called Transducer Markup Language is becoming increasingly common. It can be used to create a type of XML document that describes data produced by a transducer in a standardized way and includes metadata that describes the system producing the data, the date and time it was collected, and how multiple devices relate to one another within a network or via Global Positioning System coordinates.
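To give a feel for what such a document might contain, the Python sketch below wraps a single reading in XML along with that kind of metadata, using the standard library's xml.etree.ElementTree module. The element names are illustrative assumptions, not the actual Transducer Markup Language schema.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

def wrap_reading(system_id, lat, lon, value):
    """Build an XML document carrying a reading plus descriptive metadata.

    Element names here are simplified for illustration; a real TML
    document follows the published schema rather than this layout.
    """
    root = ET.Element("transducerData")
    meta = ET.SubElement(root, "metadata")
    ET.SubElement(meta, "system").text = system_id
    ET.SubElement(meta, "collected").text = datetime.now(timezone.utc).isoformat()
    pos = ET.SubElement(meta, "position")
    pos.set("lat", str(lat))
    pos.set("lon", str(lon))
    ET.SubElement(root, "value", units="microstrain").text = str(value)
    return ET.tostring(root, encoding="unicode")

print(wrap_reading("bridge-deck-03", 44.9778, -93.2650, 5.07))
```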
Marport Canada Inc.’s SmartBridge is one of the leading systems for that type of data collection.
Of the nation's nearly 500,000 bridges, the Federal Highway Administration cataloged 25.8 percent as structurally deficient or functionally obsolete as of 2006. That doesn't mean they are heading for collapse, but it does mean they need monitoring. Traditionally, that has meant periodic visual inspections. But as the 2007 collapse of the I-35W Mississippi River bridge in Minneapolis showed, visual inspections might not be enough.
The replacement bridge built in Minneapolis contained hundreds of special sensors, many cast right into the concrete. The University of Minnesota and the Minnesota Department of Transportation monitor the data those sensors collect.
Realizing that there will be a growing demand for such systems, researchers at Clarkson University have developed a prototype bridge sensor that doesn't need a battery. It powers itself with the vibrations of a typical bridge, much like the flashlights you charge by cranking or shaking.
Right now, transportation-related sensors are leading the way. But other agencies will soon notice their own flood of sensor data. The Agriculture Department will see more data from crop and livestock sensors. The Energy Department will see more information on energy consumption and how weather and cost affect it. Meanwhile, the Homeland Security Department is already dealing with data from border sensors and video surveillance systems.
Government data-center and network managers can prepare for the flood of sensor data by asking the following questions when systems are installed:
Who is ultimately responsible for the sensor network? Who will maintain it, and who is responsible for troubleshooting it if the flow of data is interrupted?
How will the collected data be moved from the remote site to a receiving facility? Via a government network? University network? Leased lines? If so, leased by whom?
Will the sensor data simply pass through a network on its way to a specific end user (researcher, highway manager, etc.)? Will that end user be solely responsible for collecting and storing the data, or will government IT managers also be responsible for data collection, database development, backups and more?
Is there a service-level agreement associated with the data collection, perhaps covering how often data will be updated, how long it will be stored and how accurately the information will be represented?
Is any sort of data conversion necessary for multiple applications to access and integrate the data? (One such conversion is sketched below.)
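As an example of the kind of conversion that last question anticipates, this simple Python sketch flattens an XML record like the one shown earlier into JSON for applications that expect that format. The element names remain illustrative assumptions rather than a published schema.

```python
import json
import xml.etree.ElementTree as ET

# A sample record in the illustrative XML layout used above.
SAMPLE_XML = """<transducerData>
  <metadata>
    <system>bridge-deck-03</system>
    <collected>2009-06-01T12:00:00+00:00</collected>
    <position lat="44.9778" lon="-93.2650"/>
  </metadata>
  <value units="microstrain">5.07</value>
</transducerData>"""

def xml_to_json(xml_text):
    """Flatten the illustrative XML record into a JSON object."""
    root = ET.fromstring(xml_text)
    meta = root.find("metadata")
    pos = meta.find("position")
    value = root.find("value")
    return json.dumps({
        "system": meta.findtext("system"),
        "collected": meta.findtext("collected"),
        "lat": float(pos.get("lat")),
        "lon": float(pos.get("lon")),
        "units": value.get("units"),
        "value": float(value.text),
    }, indent=2)

print(xml_to_json(SAMPLE_XML))
```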
Each of those issues could result in more work for an IT department. Thus, staffing and system considerations should be part of any integration plan.
Finally, civil engineering projects are producing more than just sensor data these days. As design technologies and site mapping improve, they create new data files for the government to manage, some of which are quite large.
In addition to computer-aided design files, site plans, elevation information and environmental impact reports, a technology called building information modeling creates 3-D datasets that allow users to navigate through all the data associated with a large engineering project, including information that supports managing a facility's entire life cycle.
Clearly, civil engineering projects will continue to spawn a new flood of data types and system requirements. Being prepared and asking the right questions are essential for anyone who is expected to handle that flood.