Is Hadoop the death of data warehousing?
To what extent is big data changing the traditional data analytics landscape?
The Hadoop ecosystem has exploded in the last three years, with major IT vendors announcing connectors to Hadoop, augmentations built on top of Hadoop or their own “enterprise-ready” distributions of Hadoop. Given Hadoop’s rapid rise in adoption and an ecosystem expanding in both depth and breadth, it is natural to ask whether Hadoop’s ascension will cause the demise of traditional data warehousing solutions.
Another way to frame this question is in a bigger context: To what extent is big data changing the traditional data analytics landscape?
Data warehousing is a set of techniques and software for collecting data from operational systems, integrating and harmonizing that data into a centralized database, and then analyzing, visualizing and tracking key performance indicators on a dashboard.
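To make that concrete, here is a minimal sketch of the extract-transform-load pattern in Python, with the built-in sqlite3 module standing in for the central warehouse database; the file, table and column names are hypothetical.

```python
import csv
import sqlite3

# Extract: pull raw records from a hypothetical operational-system export.
with open("orders_export.csv", newline="") as f:
    raw = list(csv.DictReader(f))

# Transform: harmonize the data into the warehouse's conventions.
clean = [
    (r["order_date"][:10],           # normalize timestamps to YYYY-MM-DD
     r["customer_id"],
     round(float(r["amount"]), 2))   # one currency, two decimal places
    for r in raw
]

# Load: append the harmonized rows to a central fact table
# (sqlite3 stands in for the warehouse database here).
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS fact_orders "
             "(order_date TEXT, customer_id TEXT, amount_usd REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
conn.commit()
conn.close()
```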
A key difference between data warehousing and Hadoop is that a data warehouse is typically implemented in a single relational database that serves as the central store. In contrast, Hadoop and the Hadoop Distributed File System (HDFS) are designed to span multiple machines and handle huge volumes of data that surpass the capacity of any single machine.
Furthermore, the Hadoop ecosystem includes data warehousing layers and services built on top of the Hadoop core, including SQL (Presto), SQL-like (Hive) and NoSQL (HBase) data stores. In contrast, over the last decade, large data warehouses shifted to custom multiprocessor appliances, such as those from Netezza (acquired by IBM) and Teradata, to scale to large volumes. Unfortunately, those appliances are very expensive and out of reach for most small- to medium-sized businesses.
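For a flavor of how that SQL-like layer is used in practice, here is a short sketch that queries Hive from Python. It assumes a reachable HiveServer2 instance and the third-party PyHive package; the host name and the fact_orders table are hypothetical.

```python
from pyhive import hive  # third-party package: pip install pyhive

# Connect to a hypothetical HiveServer2 instance. Hive exposes data stored
# in HDFS through HiveQL, a SQL-like dialect familiar to warehouse users.
conn = hive.connect(host="hive.example.com", port=10000)
cursor = conn.cursor()

# A warehouse-style aggregation expressed in HiveQL; behind the scenes it
# is compiled into jobs that run across the Hadoop cluster.
cursor.execute("""
    SELECT order_date, SUM(amount_usd) AS daily_revenue
    FROM fact_orders
    GROUP BY order_date
    ORDER BY order_date
""")
for order_date, daily_revenue in cursor.fetchall():
    print(order_date, daily_revenue)
```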
With this background and context it’s natural to ask: Is Hadoop the death of data warehousing?
To answer this question, it’s important to separate the techniques of data warehousing from their implementation. Hadoop (and the advent of NoSQL databases) will augur the demise of data warehousing appliances and the “traditional” single-database implementation of a data warehouse.
Evidence of this can be seen in Hadoop vendors like Cloudera billing its platform as an “enterprise data hub,” in essence subsuming the need for traditional data management solutions. A similar sentiment was expressed in a recently published ReadWrite.com article titled, “Why proprietary big data technologies have no hope of competing with Hadoop.” Likewise, a recent Wall Street Journal article described how Hadoop is challenging Oracle and Teradata.
And the Hadoop/NoSQL ecosystem is still evolving. Many big data environments are choosing hybrid approaches that span NoSQL, SQL and even NewSQL data stores. Additionally, alternatives to the MapReduce parallel processing engine, such as the Apache Spark project, are on the horizon. So, while this story is far from over, it is safe to say that traditional, single-server relational databases and database appliances are not the future of big data or data warehousing.
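To give a feel for the processing model in question, here is the classic word-count example in PySpark; a minimal sketch assuming a Spark installation and a hypothetical HDFS input path.

```python
from pyspark import SparkContext

sc = SparkContext(appName="WordCount")

# The classic MapReduce example: the map phase emits (word, 1) pairs and
# the reduce phase sums the counts per word, distributed across the cluster.
counts = (sc.textFile("hdfs:///data/logs/*.txt")   # hypothetical input path
            .flatMap(lambda line: line.split())
            .map(lambda word: (word, 1))
            .reduceByKey(lambda a, b: a + b))

for word, count in counts.take(10):
    print(word, count)

sc.stop()
```

Spark’s advantage over classic MapReduce is that intermediate results like these can be kept in memory rather than written to disk between stages.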
On the other hand, the techniques of data warehousing, including extract, transform and load (ETL), dimensional modeling and business intelligence, will be adapted to the new Hadoop/NoSQL environments. Furthermore, those techniques will morph to support more hybrid environments. The key principle seems to be that not all data is equal, so IT managers should choose the data storage and access mechanism that best suits how the data will be used. Hybrid environments could include key-value stores, relational databases, graph stores, document stores, columnar stores, XML databases, metadata catalogs and others.
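As a toy illustration of that principle, the following Python sketch routes records to different stand-in stores based on how the data will be used; the stores and routing rules are invented for illustration only.

```python
import json
import sqlite3

# Toy stand-ins for three stores in a hybrid environment.
kv_store = {}                              # key-value: hot, lookup-by-key data
relational = sqlite3.connect(":memory:")   # relational: structured, queryable facts
relational.execute("CREATE TABLE orders (id TEXT, amount REAL)")

def store(record):
    """Route each record to the store that best suits its usage."""
    kind = record["kind"]
    if kind == "session":      # transient session state -> key-value store
        kv_store[record["id"]] = record["payload"]
    elif kind == "order":      # transactional facts -> relational store
        relational.execute("INSERT INTO orders VALUES (?, ?)",
                           (record["id"], record["payload"]["amount"]))
    else:                      # semi-structured data -> document store
        with open(f"doc_{record['id']}.json", "w") as f:
            json.dump(record["payload"], f)

store({"kind": "session", "id": "s1", "payload": {"user": "alice"}})
store({"kind": "order", "id": "o1", "payload": {"amount": 19.99}})
store({"kind": "profile", "id": "p1", "payload": {"bio": "analyst"}})
```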
As you can see, this is not a simple question, and it does not lend itself to a simple answer. Nevertheless, while big data will change the implementation of data warehousing over the next five years, it will not render the concepts and practice of data warehousing obsolete.
What does this mean for the federal government’s huge investments in data warehouses?
First, when the capacities of current data warehouses are exceeded, those warehouses will be migrated to Hadoop-based, multimachine or cloud-hosted solutions. Second, instead of a one-size-fits-all approach, organizations will look to match their big data volumes to hybrid storage approaches.
Michael C. Daconta (mdaconta@incadencecorp.com) is the vice president of advanced technology at InCadence Strategic Solutions and the former metadata program manager for the Homeland Security Department. His new book is The Great Cloud Migration: Your Roadmap to Cloud Computing, Big Data and Linked Data.