Challenges of data consolidation
While the benefits of data integration and consolidation can be clear, the process is not easy to implement.
Those responsible for optimizing the performance of a federal data center typically consolidate data from systems that monitor building management, electrical power and data center infrastructure across the network. These software platforms provide information on energy usage, cooling operation and IT optimization – the information data center managers need to stay within federal mandates and on budget. But while the benefits of data integration and consolidation can be clear, the process is not well understood or easy to implement.
Data must first be consolidated across the network and normalized before it can produce the reporting and analytics that drive actionable decision making. From here, this information can be separated into historical data and alarm notifications. For the purposes of this article, we will be exploring the management of historical data specifically: network connectivity, transfer methodologies and data normalization.
Network connectivity
Network connectivity challenges exist at the local building, campus and agency levels. Because of cybersecurity threats, certain portions of the IP network must be isolated from each other. Additionally, traffic from the building and electrical power management systems must be isolated from the rest of the network, which carries IP traffic for websites and email. This isolation can prevent facility management systems from accessing the internet to share data or services. The challenge is connecting geographically dispersed buildings so that one server in the IP network has visibility into all sites.
Overcoming all of these network issues is possible, but it requires close collaboration among facility, IT and cybersecurity teams. To even start that conversation, the organization must see a clear benefit to the change. Understanding how the agency will benefit and focusing on a clear risk-versus-reward analysis are key.
Transfer methodologies
There are many data transfer methodologies, but data center managers often base their choice on ease of implementation, which can translate into lower cost. The ease of implementation can vary based on network and platform architectures. Here are a few typical options used in the industry today.
Web services. With the prevalence of web-based technology, web services are becoming the typical standard for sharing data within control and monitoring systems. Web services transfer data over the HTTP or HTTPS protocol, much as a web browser retrieves text and images from a website. One advantage of web services is that interaction can be closer to real time. A disadvantage arises when data is pushed to a destination platform that is not ready to process it.
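As a rough sketch only, a web-services push might look like the following Python snippet. It assumes the third-party requests library and a hypothetical collector endpoint and payload; the retry loop illustrates the "destination not ready" disadvantage mentioned above.

```python
import time
import requests  # third-party HTTP library

# Hypothetical collector endpoint and reading; names and values are illustrative only.
ENDPOINT = "https://metrics.example.agency/api/readings"
reading = {"site": "bldg-12", "point": "ups1.kw_input", "value": 42.7,
           "timestamp": "2023-06-01T14:05:00Z"}

# Push the reading over HTTPS, backing off if the destination
# platform is not yet ready to process the data.
for attempt in range(5):
    try:
        resp = requests.post(ENDPOINT, json=reading, timeout=10)
        if resp.status_code == 200:
            break                      # accepted by the receiving platform
        time.sleep(2 ** attempt)       # server busy or not ready; retry later
    except requests.ConnectionError:
        time.sleep(2 ** attempt)       # endpoint unreachable; retry later
```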
Flat file transfer. Using flat file transfer, a system exports data via a typical reporting export feature to a directory on the internal network. A software process then watches for newly available files, performs any translations, formats the data for the receiving system and sends it to an internal or external FTP site. The receiving platform can import this data whenever it has spare bandwidth. This transfer methodology also has cybersecurity benefits because only an outbound path through a firewall is needed; a sketch of the pattern follows.
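A minimal sketch of that pattern in Python is shown below, using only the standard library. The directory paths, column mapping, FTP host and credentials are placeholders, not references to any real system.

```python
import csv
from ftplib import FTP
from pathlib import Path

EXPORT_DIR = Path("/exports/bms")       # where the monitoring system drops report files (illustrative)
STAGING_DIR = Path("/exports/staged")   # translated files ready to send

def translate(src: Path, dst: Path) -> None:
    """Rename columns to the receiving system's point names (hypothetical mapping)."""
    column_map = {"ChW Supply Temp": "chw_supply_temp_f", "kW Demand": "power_kw"}
    with src.open(newline="") as fin, dst.open("w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=[column_map.get(c, c) for c in reader.fieldnames])
        writer.writeheader()
        for row in reader:
            writer.writerow({column_map.get(k, k): v for k, v in row.items()})

# Pick up newly exported files, translate them, then push them outbound to the FTP drop site.
with FTP("ftp.example.agency") as ftp:     # only an outbound firewall rule is required
    ftp.login("svc_account", "password")   # credentials are placeholders
    for src in EXPORT_DIR.glob("*.csv"):
        dst = STAGING_DIR / src.name
        translate(src, dst)
        with dst.open("rb") as fh:
            ftp.storbinary(f"STOR {dst.name}", fh)
        src.unlink()                       # remove the processed export
```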
Open protocols. The most basic but still very reliable method is to transfer data via existing open protocols such as Modbus, BACnet or SNMP. Typically, a system that reads data directly from equipment can share that same data as a server to a receiving client. The client system can read the data via the open protocol just as if it were talking to the equipment itself, and it can read from multiple systems, consolidating the data into a single platform.
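As one hedged example of this client-side polling, the snippet below uses the third-party pymodbus library to read holding registers from two site-level systems over Modbus TCP and gather the raw values in one place. The hosts and register addresses are placeholders, and pymodbus call signatures vary somewhat between library versions.

```python
from pymodbus.client import ModbusTcpClient  # third-party library; import path differs in older versions

# Poll holding registers from two site systems that expose Modbus TCP,
# consolidating the raw values into one structure.
SITES = {"bldg-12-bms": "10.1.12.50", "bldg-14-pms": "10.1.14.50"}
consolidated = {}

for site, host in SITES.items():
    client = ModbusTcpClient(host, port=502)
    if client.connect():
        result = client.read_holding_registers(address=0, count=10)
        if not result.isError():
            consolidated[site] = result.registers   # list of raw register values
        client.close()

print(consolidated)
```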
Data normalization
Data normalization involves aligning the naming, units and scaling of different data sources so the information is usable by the existing visualization system. Most of this alignment happens during the Extract, Transform, Load (ETL) process, which pulls data from one source, maps or modifies it, then loads it into a destination.
In some cases, data in one system does not align well with another because a value is missing. If a system has the parameters needed to calculate that value, the ETL transform step can perform the calculation and align the results, as in the sketch below.
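The following Python sketch shows what such a transform step might do: rename points, convert units and scaling, and derive a value the target system expects but the source does not provide. The point names, scaling factors and the power-factor assumption are all illustrative.

```python
def transform(record: dict) -> dict:
    """Normalize one source record into the naming, units and scaling the target expects."""
    out = {
        "site": record["facility_id"],                         # naming alignment
        "timestamp": record["ts"],
        "supply_temp_c": (record["SupplyTemp_F"] - 32) / 1.8,  # unit conversion
        "current_a": record["Amps"] / 10.0,                    # scaling (source reports tenths of an amp)
    }
    # Missing value: the source reports volts and amps but not power,
    # so an approximate kW figure is calculated during the transform.
    out["power_kw"] = record["Volts"] * out["current_a"] * 0.95 / 1000.0  # assumed 0.95 power factor
    return out

source_row = {"facility_id": "bldg-12", "ts": "2023-06-01T14:05:00Z",
              "SupplyTemp_F": 55.0, "Amps": 428, "Volts": 480}
print(transform(source_row))
```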
Some software platforms have this ETL capability natively. In that case, the software is typically configured with a target IP address and authentication parameters, which define the connection to the other system's standard data exchange.
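In practice, that configuration often amounts to little more than connection details, along the lines of this hypothetical example; the field names and values are illustrative, not those of any particular product.

```python
# Illustrative only: the parameters a natively supported connector
# typically asks for when pointing at another system's data exchange.
connector_config = {
    "target_host": "10.1.20.15",      # IP address of the source platform's API
    "target_port": 443,
    "protocol": "https",
    "username": "svc_integration",    # authentication parameters
    "password": "********",
    "poll_interval_s": 300,           # how often to pull historical records
}
```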
Unfortunately, native support typically leaves out critical parameters related to customization or data that does not translate easily. If native support falls short, a quality data integration tool is required, along with expertise in how each platform stores and shares data.
Data consolidation has to be treated as a project, not just a simple task. Upfront planning of all the steps involved will help data center managers understand the scope of work. Some data integration projects may require an outside partner with expertise in architecting complex data integrations.