How big data is remaking the government data center
The spread of big data tools and technologies is not only altering the way government data is analyzed but also reshaping the data center itself.
The big data trend is exerting its influence almost everywhere data is being mass produced, from the high-performance computing arena to the operations and planning departments of the nation’s big cities.
Yet another area where big data is having a substantial impact has received less attention: the data center itself, where technology managers say big data is gradually reshaping traditional data systems and practices.
Big data refers to the problem of processing very large data sets that defy conventional data management and analysis technologies. The growth of these data sets and the need to extract value from them have compelled agencies to start using tools such as the Apache Software Foundation’s Hadoop distributed computing framework, columnar databases and other big data management solutions.
The adoption of these technologies, in turn, is leading to a gradual restructuring of the data center, including the move to more converged IT infrastructures, alterations in data center traffic patterns and changes in the basic economics of storage.
Agency interest in big data management is hardly surprising given the spectacular growth of data. Van Young, big data and cloud solutions strategist at HP, citing IDC figures, noted that the volume of data worldwide is projected to reach 40 zettabytes by 2020 (one zettabyte is 1 billion terabytes), up from 1.8 zettabytes in 2012.
At the agency level, the struggle to manage data on that scale has already begun to tax conventional IT systems, and it is encouraging data center managers to adopt solutions that were considered exotic only a couple of years ago. “You need to deploy some newer technology,” said Young, who spoke recently at a Public Sector Partners Inc. big data conference.
Hadoop as change agent
Hadoop is among the more notable of the recent arrivals. The open source software framework includes the Hadoop Distributed File System (HDFS), which distributes large data sets across the servers in a Hadoop cluster. Another key Hadoop component, MapReduce, handles distributed processing of that data across the same servers. This structure keeps data and processing resources in close proximity within the cluster.
A Hadoop cluster’s servers are typically interconnected via gigabit Ethernet or, in some cases, 10 gigabit Ethernet.
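To make that division of labor concrete, the sketch below shows a minimal word-count job written for Hadoop Streaming, which lets MapReduce run ordinary scripts as its map and reduce steps. The file name, HDFS paths and job options are illustrative assumptions, not details drawn from any agency deployment described here.

```python
#!/usr/bin/env python3
# wordcount.py -- illustrative Hadoop Streaming job (hypothetical file name).
# MapReduce schedules the map step on the nodes that hold each HDFS block,
# so the processing travels to the data rather than the other way around;
# the framework then sorts and groups the map output by key before the
# reduce step sums the counts.
import sys


def map_step():
    # Emit one (word, 1) pair per word in this node's local slice of the data.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word.lower()}\t1")


def reduce_step():
    # Input arrives sorted by key, so all counts for a word are contiguous.
    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word != current_word and current_word is not None:
            print(f"{current_word}\t{count}")
            count = 0
        current_word = word
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")


if __name__ == "__main__":
    # Invoked by the hadoop-streaming jar, for example:
    #   hadoop jar hadoop-streaming.jar -files wordcount.py \
    #     -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
    #     -input /data/raw/logs -output /data/out/wordcount
    map_step() if sys.argv[1] == "map" else reduce_step()
```

Because each map task reads the HDFS blocks stored on its own node, most of a job's traffic stays inside the cluster, a point that matters for the traffic-pattern changes discussed later in this article.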
Taken together, these technologies within the Hadoop architecture are leading the way to a more converged data center infrastructure.
“Hadoop, when you think about it, is the combination of storage, servers and networking together — and the cluster communication in between,” said Young, who contrasted Hadoop’s approach with the previous practice of managing servers and storage in separate silos.
“Hadoop changed the landscape of data center infrastructure,” Young said. “Converged infrastructure is really the key.”
The impact of Hadoop, however, varies from agency to agency. Peter Guerra, principal in Booz Allen Hamilton’s Strategic Innovation Group, said many federal agencies are testing the Hadoop waters. Some agencies are operating Hadoop pilots alongside what their data centers normally run on a day-to-day basis.
Changes in data center ecosystem
But Hadoop has proven more transformative at other agencies, particularly within the Defense Department, where Guerra said some organizations are opting for Hadoop over traditional network-attached storage (NAS) and storage-area network (SAN) technologies.
“We have seen clients go away from a NAS and SAN type architecture, favoring large Hadoop clusters instead for long-term data storage needs,” Guerra said, calling it a pattern that occurs among agencies that deploy Hadoop as enterprise technology in mission-critical areas.
Storage isn’t the only aspect of the data center affected by the big data shift, however. Robert Wisnowski, a big data storage and cloud solutions strategist at HP, said big data is also changing traffic patterns in the data center. Nodes in a Hadoop cluster communicate with one another in an east-west pattern, meaning server-to-server traffic within the data center, rather than the north-south pattern of traffic flowing in and out of the facility that is more typical of traditional data centers.
“So this is something to keep in mind and consider as you are looking at the impacts to your data center,” Wisnowski said. “I want to think about flatter, simpler types of data center fabrics to improve performance and reduce that latency and reduce the cost.”
In other cases, however, big data systems can exploit data center upgrades that evolved independently of the big data trend. Rich Campbell, chief technologist at EMC Federal, said the majority of the company’s federal customers are transitioning from GigE to 10GigE.
Those upgrades were not necessarily planned with big data in mind, he said, but big data systems will be able to leverage the new networks just the same. Similarly, agencies that have virtualized their servers already have a resource that big data workloads can tap. “They don’t need more servers, they just use them in a different way,” he said.
Yet another impact of big data technology on the data center: the potential to reduce costs. A traditional enterprise data warehouse, for instance, uses extract, transform and load (ETL) tools to integrate data from various sources and shuttle it into the warehouse.
Guerra said this process can prove complicated, time consuming and expensive. Hadoop, in contrast, lets agencies load raw data into the cluster and integrate it there as part of the analytics workflow. In some circumstances, this can reduce an agency’s reliance on separate data integration and ETL tools, which Guerra said saves on software licensing costs.
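A minimal sketch of that pattern, again using Hadoop Streaming, appears below. The feeds, field names and delimiters are hypothetical, chosen only to show raw source files being reconciled inside the cluster rather than in an upstream ETL stage.

```python
#!/usr/bin/env python3
# normalize.py -- illustrative map-only Hadoop Streaming step (hypothetical
# file and field names). Two raw feeds are landed in HDFS exactly as the
# source systems produce them, one comma-delimited and one pipe-delimited,
# and reconciled into a single tab-separated layout inside the cluster
# instead of being transformed by a separate ETL tool before loading.
import sys

for line in sys.stdin:
    line = line.rstrip("\n")
    if not line:
        continue
    delimiter = "|" if "|" in line else ","
    fields = [field.strip() for field in line.split(delimiter)]
    if len(fields) < 3:
        continue  # skip malformed records rather than failing the job
    record_id, agency, amount = fields[:3]
    # Emit the common schema; downstream analytic jobs read this layout
    # straight from HDFS, with no external integration tooling in the path.
    print(f"{record_id}\t{agency}\t{amount}")
```

Run with reducers disabled (for example, with the streaming option -numReduceTasks 0), a job like this simply leaves its normalized output in HDFS for whatever analysis follows, which is where the licensing savings Guerra describes come from.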
A converged big data infrastructure also saves money. Wisnowski said bringing storage and servers together lets IT organizations reduce costs, and converged systems do more in a smaller footprint, trimming power and cooling expenses.