In 2014, big data technology goes mainstream — and real-time
The big data ecosystem of tools will continue to evolve as government managers seek faster access to data and advanced analytics.
Government managers want faster access to data and analytics as they address a host of issues in 2014, including cybersecurity, fraud detection, crime prevention, medical research, weather modeling and situational awareness.
As a result, big data tools that deliver rapid access, real-time analysis and pattern recognition will be in big demand in 2014, industry experts say. The “must-have” tools for agency data managers include Apache Hadoop, the open-source framework that breaks up large data sets and distributes the processing work, and NoSQL databases, which can handle unstructured data and deliver fast query speeds across large data sets. Next year, government IT watchers say, these tools will become more widely deployed and will increasingly work in conjunction with traditional IT systems and databases.
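The divide-and-process pattern Hadoop relies on is easier to see in a small sketch. The Python script below imitates, on a single machine, the mapper/reducer contract used by Hadoop Streaming: split the records, emit key/value pairs, group by key, then aggregate. The server-log layout and field positions are assumptions for illustration only, and the local sort stands in for Hadoop’s shuffle phase.

    # Local stand-in for Hadoop's split/map/shuffle/reduce cycle.
    # Assumption: input lines look like Apache-style web server logs, with the
    # requested URL path in field 6. This is an illustration, not Hadoop itself.
    import sys

    def mapper(lines):
        """Emit one (key, 1) pair per record -- here, the requested URL path."""
        for line in lines:
            fields = line.split()
            if len(fields) > 6:            # assumed log layout
                yield fields[6], 1         # field 6 = request path (assumption)

    def reducer(pairs):
        """Sum counts per key; Hadoop guarantees pairs arrive grouped by key."""
        current, total = None, 0
        for key, count in pairs:
            if key != current:
                if current is not None:
                    yield current, total
                current, total = key, 0
            total += count
        if current is not None:
            yield current, total

    if __name__ == "__main__":
        # The sort plays the role of the cluster's shuffle phase.
        mapped = sorted(mapper(sys.stdin))
        for path, hits in reducer(mapped):
            print("%s\t%d" % (path, hits))

Run locally as “cat access.log | python count_hits.py”; on a real cluster, Hadoop Streaming would run the mapper and reducer as separate scripts against each node’s slice of the data.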
Big data technology “will also become increasingly domain specific, with tools and capabilities tailor-made for specific types of data,” said Abe Usher, chief innovation officer with the HumanGeo Group, a technology provider and security consulting firm. Three sources that generate large volumes of data and are being targeted by these tools are social media, server logs used for information security and Web clickstream data.
The cloud will be the underlying technology these tools run on, making it easier to procure the hardware needed to process such large data sets, said Chris Biow, principal technologist and technical director with MongoDB, a provider of an open-source NoSQL database.
To take advantage of these trends, government IT managers will have to understand how to apply cloud services, bulk analytics offered by Hadoop and the interactive analytics of NoSQL databases appropriately, Biow said.
Hadoop goes mainstream
Hadoop has gone through a rapid maturing process over the past year. Initially, the open-source framework had a narrow set of features and was used mainly by early adopters in the defense and health research sectors. Many organizations were reluctant to put it into production because its security features were underdeveloped, said Ely Kahn, co-founder and vice president of business development with Sqrrl, which develops a commercial version of Accumulo, the secure, scalable open-source NoSQL database.
Now there are commercial versions of Hadoop from Hortonworks and Cloudera with improved security features. More secure and scalable databases are also available, including Accumulo, which was developed at the National Security Agency and turned over to the open-source community. The system runs on top of Apache Hadoop.
Moreover, analytics and interfaces in big data tools are becoming more standardized. Historically, “you had to be a Java programmer or NoSQL database expert to get any analytics out of Hadoop,” Kahn said. But now Hadoop is adopting SQL-based application programming interfaces, the kind of query interface most programmers are already used to.
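As a rough illustration of what that shift means in practice, the hedged sketch below submits a plain SQL aggregation to a Hadoop cluster through Apache Hive, one example of a SQL interface over Hadoop. It assumes the third-party PyHive client and a reachable HiveServer2 endpoint; the host, table and column names are hypothetical placeholders.

    # The kind of aggregation that once required a hand-written Java MapReduce
    # job, expressed as a familiar SQL query against a Hive table.
    # Assumption: PyHive is installed (pip install pyhive) and a HiveServer2
    # endpoint is reachable; all names below are placeholders.
    from pyhive import hive

    conn = hive.connect(host="hadoop-edge.example.gov", port=10000,
                        username="analyst")
    cursor = conn.cursor()

    # Hive translates this query into distributed work across the cluster.
    cursor.execute("""
        SELECT src_ip, COUNT(*) AS failed_logins
        FROM security_logs                 -- hypothetical server-log table
        WHERE event_type = 'auth_failure'
        GROUP BY src_ip
        ORDER BY failed_logins DESC
        LIMIT 20
    """)

    for src_ip, failed_logins in cursor.fetchall():
        print(src_ip, failed_logins)

The point is less the specific query than the interface: analysts who already know SQL can work against Hadoop-scale data without writing MapReduce code.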
Other signs of Hadoop emerging as a mainstream technology include its “ability to know the lineage of any data in the system, including not just where the data came from but who acted on it and what changes were made,” said Bob Gourley, founder and chief technology officer of CrucialPoint, a research and advisory firm.
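To make the lineage idea concrete, the purely illustrative sketch below shows the kind of record such a capability keeps alongside each piece of data: where it came from, who acted on it and what changed. The field names are assumptions, not any particular product’s schema.

    # Illustrative only: a minimal data structure for tracking data lineage.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class LineageEntry:
        actor: str        # user or service that touched the data
        action: str       # e.g. "ingested", "normalized", "redacted"
        detail: str       # what changed
        at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @dataclass
    class TrackedRecord:
        source: str                        # where the data came from
        payload: dict
        lineage: list = field(default_factory=list)

    rec = TrackedRecord(source="agency_feed_07", payload={"name": "J. DOE"})
    rec.lineage.append(LineageEntry("etl-service", "normalized", "uppercased name"))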
Hadoop is also moving rapidly toward handling real-time workloads. When Hadoop was first created, its focus was on “batch” workloads; the move into the real-time space lets it handle a much wider range of jobs. Yet another improvement is the ability to apply search capabilities such as Lucene directly over Hadoop holdings. All these improvements are giving rise to the concept of an “enterprise data hub,” where data is presented in such a way that any application, including legacy applications, can access it, Gourley said.
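Lucene itself is a Java library, so the hedged sketch below uses Whoosh, a pure-Python library with a similar index-then-query model, simply to illustrate the kind of full-text search Gourley describes running over data holdings. The documents and field names are invented for the example.

    # Sketch of Lucene-style search: build an inverted index, then query it.
    # Uses the pure-Python Whoosh library (pip install whoosh) as a stand-in
    # for Lucene; documents and fields here are made up.
    import tempfile

    from whoosh.index import create_in
    from whoosh.fields import Schema, TEXT, ID
    from whoosh.qparser import QueryParser

    schema = Schema(doc_id=ID(stored=True), body=TEXT(stored=True))
    ix = create_in(tempfile.mkdtemp(), schema)

    # Index a few documents; in an enterprise data hub these could be records
    # already sitting in Hadoop.
    writer = ix.writer()
    writer.add_document(doc_id="1", body="Suspicious login attempts from overseas addresses")
    writer.add_document(doc_id="2", body="Routine maintenance window for payroll servers")
    writer.commit()

    # Query the index interactively, without rescanning the raw data.
    with ix.searcher() as searcher:
        query = QueryParser("body", ix.schema).parse("login attempts")
        for hit in searcher.search(query, limit=10):
            print(hit["doc_id"], hit.score)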
Analytics at the speed of thought
In-memory processing will continue to make inroads as managers seek faster access to information to make reliable and smart decisions, said Charles Lewis, principal information systems engineer with Mitre, a research organization that helps government agencies assess technology. Retrieving massive amounts of data from disk storage slows down the analytic process.
To address this problem, database administrators usually preprocess data into query sets or aggregated tables so the computer has to deal with a smaller number of records. In-memory processing instead loads all relevant data into RAM, so the data does not have to be retrieved from disk storage. Plus, it’s possible to see the data at a deeper level of detail, enabling “analytics at the speed of thought,” Lewis said.
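A minimal sketch of that pattern, using only Python’s standard library: relevant records are copied once from a disk-backed database into an in-memory one, and the interactive queries then run entirely against RAM. The file, table and column names are hypothetical.

    # Minimal in-memory pattern: load the working set into RAM once, then
    # query it interactively. SQLite's ":memory:" database stands in for a
    # full in-memory analytics engine; names below are hypothetical.
    import sqlite3

    disk_db = sqlite3.connect("border_crossings.db")    # assumed disk-backed store
    ram_db = sqlite3.connect(":memory:")                 # lives entirely in RAM

    # One-time load of only the slice of data the analysis needs.
    ram_db.execute("CREATE TABLE crossings (traveler_id TEXT, port TEXT, ts TEXT)")
    rows = disk_db.execute(
        "SELECT traveler_id, port, ts FROM crossings WHERE ts >= date('now', '-1 day')"
    )
    ram_db.executemany("INSERT INTO crossings VALUES (?, ?, ?)", rows)
    ram_db.commit()

    # Drill-downs and re-aggregations now touch memory only.
    busiest = ram_db.execute(
        "SELECT port, COUNT(*) FROM crossings GROUP BY port ORDER BY 2 DESC LIMIT 5"
    ).fetchall()
    print(busiest)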
The downside is that only so much data can be stored in memory, whereas a hard drive can store petabytes. One fix is to perform the critical processing in in-memory databases while using tools that swap data between disk and memory as needed. Agencies that need real-time information about people entering the country, such as Customs and Border Protection, the Homeland Security Department or the Transportation Security Administration, would be primary candidates for in-memory processing systems, Lewis noted. Oracle, SAP and NoSQL databases such as MongoDB provide in-memory processing.
“Solutions for in-memory computing architectures are well thought out, can be very secure and can also be designed for long-term storage that spans multiple physical locations for backup,” said CrucialPoint’s Gourley. “Traditional storage (disk and tape) will be with us for a long time, but these new in-memory methods are very compelling.”