Accelerating time to value with DataOps
Once the right technology, infrastructure and people are in place, agencies can begin harnessing their data to create significant value.
Over the past several years, enterprises have begun harnessing their data to create significant enterprise value. The range of applications is wide, from auto companies analyzing customers' histories to anticipate their needs, to pharmaceutical firms getting effective new drugs to patients faster by analyzing R&D data.
Government agencies, however, have had a tougher time jumping into the data revolution due to structural and policy challenges that hinder collaboration. They also face difficulties validating technology, skills shortages, security concerns, soaring volumes of structured and unstructured data in different formats and a patchwork of siloed IT systems. The federal government is making strides, though, thanks to the Federal Data Strategy, which clearly articulates the mission of leveraging the full value of federal data for mission, service and the public good.
While they may be playing catch-up with their industry counterparts, agencies can set themselves up for success by adopting the best of the private sector's practices. One approach that is proving immensely valuable, and that accelerates the payoff from new data initiatives, is DataOps.
DataOps is modeled after DevOps, a set of practices that combines Agile development methodology and IT operations to shorten software development life cycles, tighten feedback loops and ultimately deliver better products through iteration. DataOps breaks the daunting data endeavor down into straightforward, manageable steps that produce real results.
The three essential components of DataOps
The first leg of the DataOps stool is process. Like DevOps, DataOps uses Agile methodology to create valuable new analytics for the organization. Agile accepts that people don't necessarily know what they want until they see it, so new analytics are delivered on a short time frame, users give immediate feedback, and that feedback is incorporated into the next version of the analytics. This tight feedback loop steers the development of new analytics toward the features that are most valuable to the agency.
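One way to picture that loop in code: each short iteration ships a small analytic along with automated data checks, so problems surface immediately and can be folded into the next cycle. The sketch below is illustrative only; the dataset, column names and the "monthly fee totals" analytic are hypothetical stand-ins for whatever an agency actually builds.

```python
# Minimal sketch of one DataOps iteration: validate the data, then produce
# the sprint's analytic. Dataset and column names are hypothetical.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems found in this iteration."""
    problems = []
    if df["fee"].lt(0).any():
        problems.append("negative fee values")
    if df["issue_date"].isna().any():
        problems.append("missing issue dates")
    return problems

def build_analytic(df: pd.DataFrame) -> pd.DataFrame:
    """This sprint's analytic: monthly permit-fee totals."""
    df["issue_date"] = pd.to_datetime(df["issue_date"])
    return df.groupby(df["issue_date"].dt.to_period("M"))["fee"].sum().reset_index()

if __name__ == "__main__":
    data = pd.read_csv("permits.csv")          # hypothetical input
    issues = validate(data)
    if issues:
        raise SystemExit(f"Data checks failed: {issues}")  # fast feedback to the team
    print(build_analytic(data).head())
```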
The second consideration is technology. Agencies must turn to both open source and commercial technologies that can address the complexity of the modern data supply chain. These components will need to be integrated into end-to-end solutions, but fortunately, they have been built to support the interoperability necessary to make them work together.
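Interoperability in practice often comes down to agreeing on standard formats and interfaces so individual stages can be swapped without rebuilding the whole pipeline. The following sketch shows that idea at its simplest, with plain functions over CSV and SQL standing in for the open source and commercial tools an agency would actually integrate; file and table names are hypothetical.

```python
# Minimal sketch of an end-to-end data supply chain: extract, transform, load.
# Each stage talks to the next through a simple, standard interface.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    # Normalize field names and types so downstream tools agree on one schema.
    return [(r["agency"].strip().upper(), float(r["amount"])) for r in rows]

def load(records: list[tuple], db: str = "warehouse.db") -> None:
    with sqlite3.connect(db) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS spending (agency TEXT, amount REAL)")
        conn.executemany("INSERT INTO spending VALUES (?, ?)", records)

if __name__ == "__main__":
    load(transform(extract("spending.csv")))   # hypothetical source file
```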
In terms of infrastructure to support the data supply chain, agencies have strong compute and storage options both on premises and in the cloud. Now that many of the reasons agencies were initially skittish about cloud solutions no longer hold, they should look to cloud-based solutions for flexibility and scalability.
Finally, people -- data suppliers, preparers and consumers -- are an indispensable part of the DataOps mix. Underscoring the strategic importance of data, chief data officers -- responsible for lifecycle data management, coordination with officials to ensure that data needs are met, alignment with best practices and other tasks -- are now mandated by law. Many agencies already have people with the right talents on staff.
Once the right technology, infrastructure and people are in place, another big focus is governance. Data governance tools are rapidly maturing, but interoperability between them and the rest of the data infrastructure is still a significant challenge. The DataOps team will need to bridge the gaps between these toolchains to help ensure compliance with regulations.
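Bridging those toolchains can be as simple as making every pipeline job consult the governance catalog before it touches a dataset. The sketch below assumes a toy in-memory catalog; the dataset names and metadata fields (owner, retention_days, contains_pii) are hypothetical stand-ins for whatever a real governance tool exposes.

```python
# Minimal sketch of a pre-run governance gate: a pipeline job checks its
# inputs against a policy catalog before it executes. All names are hypothetical.
REQUIRED_FIELDS = {"owner", "retention_days", "contains_pii"}

CATALOG = {
    "traveler_manifest": {"owner": "cbp", "retention_days": 365, "contains_pii": True},
    "flight_telemetry": {"owner": "afrl", "retention_days": 1825, "contains_pii": False},
}

def governance_check(dataset: str) -> None:
    meta = CATALOG.get(dataset)
    if meta is None:
        raise ValueError(f"{dataset} is not registered in the data catalog")
    missing = REQUIRED_FIELDS - meta.keys()
    if missing:
        raise ValueError(f"{dataset} is missing governance metadata: {missing}")
    if meta["contains_pii"]:
        print(f"{dataset}: PII present, apply masking policy before use")

for name in ("traveler_manifest", "flight_telemetry"):
    governance_check(name)
```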
DataOps in the agency world
Today, several federal agencies are excelling at DataOps and reaping the benefits. The Department of Homeland Security, for instance, is using DataOps to keep citizens of the U.S. and its foreign allies safe at a time when global security is an increasing concern. The agency reconciles passenger lists from a wide variety of sources to feed the Global Travel Assessment System, an open source, turnkey application that gives nation-states and border security entities the ability to collect, process, query and construct risk criteria against standardized air traveler information. The application provides the capabilities needed to pre-screen travelers, helping ensure terrorists and criminals are identified when they travel. Beyond counterterrorism, GTAS capabilities could also extend into other areas, such as public health, by ensuring travelers from a region with a recent disease outbreak receive health screenings.
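The reconciliation step is the classic data unification problem: records describing the same traveler arrive from different sources in different shapes. The sketch below shows the idea at a toy scale; the field names and the simple matching key are illustrative assumptions, not how GTAS itself works, and a production system would use far richer entity resolution.

```python
# Minimal sketch of reconciling passenger records from multiple sources:
# normalize to one schema, then deduplicate on a simple key. Hypothetical fields.
from datetime import date

def normalize(record: dict) -> dict:
    return {
        "name": " ".join(record["name"].split()).upper(),
        "dob": date.fromisoformat(record["dob"]),
        "flight": record["flight"].upper(),
    }

def reconcile(*sources: list[dict]) -> list[dict]:
    seen, merged = set(), []
    for source in sources:
        for rec in map(normalize, source):
            key = (rec["name"], rec["dob"], rec["flight"])
            if key not in seen:
                seen.add(key)
                merged.append(rec)
    return merged

airline = [{"name": "jane  doe", "dob": "1980-04-02", "flight": "ua100"}]
booking = [{"name": "Jane Doe", "dob": "1980-04-02", "flight": "UA100"}]
print(reconcile(airline, booking))  # one merged record, not two
```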
When the Air Force needed to better analyze aircraft "flutter," a phenomenon that causes wings or stabilizers to oscillate, it wanted to do so without conducting a large number of physical test flights. Using DataOps and key data unification technology, the Air Force was able to apply machine learning to a large corpus of past flight test and simulation data. Users can now quickly interrogate decades' worth of technical data, identify the relevant factors and make technical predictions, dramatically reducing engineering process time.
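The general pattern is to fit a model to historical test and simulation records so engineers can query predictions for conditions that were never flown. The toy sketch below illustrates that pattern only; the features, synthetic data and model choice are assumptions for illustration, not the Air Force's actual approach.

```python
# Toy sketch: learn from past flight-condition records, then query a
# prediction for an untested condition. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Hypothetical historical records: airspeed (knots), altitude (ft), fuel load (lb)
X = rng.uniform([200, 5000, 1000], [600, 40000, 20000], size=(500, 3))
# Synthetic "flutter margin" target standing in for measured test outcomes
y = 1.0 - X[:, 0] / 1000 + X[:, 1] / 100000 + rng.normal(0, 0.02, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# An engineer's query: predicted margin for a flight condition never tested
print(model.predict([[450, 25000, 12000]]))
```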
A final word
Establishing DataOps methodologies is critical for agencies moving forward. Success starts with taking an approach that combines processes, people and technology. Any one of these considerations handled in isolation will not yield the needed results. A holistic approach to DataOps, however, holds tremendous promise for agencies looking to transform themselves through the strategic use of data.