Why DevOps is good for government
When developers and operations staff work closely to continuously deploy and monitor applications, they reduce the waste from handoffs and build quality – key motivators for agile and lean methodology.
Terms like service-oriented architecture and big data have long been part of the pantheon of buzzwords that have captivated us while eluding any kind of real understanding. Recently joining them is DevOps. In one form or another, DevOps is a growing force in the commercial space, so it is important for government IT managers to understand its underlying principles.
Similar to Brangelina and Kimye, DevOps is a portmanteau – in this case, of “development” and “operations.” Just as with other buzzwords, there isn’t an accepted definition of DevOps, so I can only offer my perspective after working in one DevOps environment and studying others. I would say the foundation of DevOps is the application of lean and agile principles to the entire lifecycle of an application – rather than merely developing an application with Scrum or Kanban and then tossing it to operations, never to see it again.
In a manner consistent with the agile principle of cross-functional teams, developers and operations staff work closely together, often interchangeably, to continuously deploy and monitor applications as they evolve from a few lines of code to production systems to legacy systems. This increases communication, reduces the waste from handoffs and builds quality – key motivators for agile and lean methodology.
DevOps is bringing more and more value to the commercial space every day, and it can do the same for government.
DevOps tools
A fundamental practice among agile development teams is automation. Developers use tools like Gradle or Rake along with continuous integration tools like Jenkins or Hudson to automate their builds, tests, documentation, and deployments to test servers.
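To make that concrete, here is a minimal sketch of what build automation looks like as code – a hypothetical Rakefile whose task and command names are purely illustrative. A Jenkins or Hudson job would simply run rake on every commit:

```ruby
# Rakefile – a minimal, illustrative sketch of an automated build
task :test do
  sh "rspec spec"               # run the test suite
end

task :package => :test do      # packaging depends on the tests passing
  sh "gem build app.gemspec"   # bundle the application for deployment
end

task :default => :package      # running `rake` alone executes the whole chain
```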
In a DevOps environment, the operations staff also relies on tools to automate tasks like deployments to production and monitoring. Configuration automation tools such as Puppet and Chef allow operations staff to express setup and deployment tasks as code and apply them to physical servers, virtual machines or containers. Because these tools use code as an abstraction for infrastructure configuration, operations can apply practices like configuration management, testing, and coding standards to Puppet manifests and Chef “cookbooks” just as developers do to application code.
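As a sketch of what infrastructure as code looks like, here is a minimal, hypothetical Chef recipe – the nginx example and file paths are illustrative, not a prescribed setup:

```ruby
# recipes/default.rb – a minimal, illustrative Chef recipe
package 'nginx'                       # install the package from the platform's package manager

template '/etc/nginx/nginx.conf' do   # render the config from a version-controlled template
  source 'nginx.conf.erb'
  notifies :reload, 'service[nginx]'  # reload the service only when the rendered file changes
end

service 'nginx' do
  action [:enable, :start]            # start the service now and on every boot
end
```

Because the recipe is just code, it can live in version control and be reviewed and tested before it ever touches a production server.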
Docker is surging in popularity among container implementations. Containers are similar to virtual machines, but they share the host’s operating system kernel rather than emulating an entire machine. This means multiple isolated containers can share resources on a single host, which is much more efficient and lightweight than running multiple virtual machines on that host. Operations can use Chef or Puppet to configure a Docker container with applications and their dependencies, and multiple containers can be packaged together to run on a single machine in isolation. This process can be automated further with Vagrant to simplify moving applications across environments. This capability could come in handy, for example, in the defense space, where multiple applications have to be packaged together to deploy to a classified environment and ultimately to theater.
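For illustration, a minimal Dockerfile might look like the sketch below; the base image and jar path are assumptions, not a recommendation:

```dockerfile
# Dockerfile – a minimal, illustrative sketch
FROM java:8                            # base image that provides a Java runtime
COPY build/libs/app.jar /opt/app.jar   # copy in the artifact produced by the build
EXPOSE 8080                            # document the port the application listens on
CMD ["java", "-jar", "/opt/app.jar"]   # launch the application when the container starts
```

Building this image once yields an artifact that runs the same way on a developer laptop, a test server or a production host.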
In a DevOps shop, developer automation and operations automation are often blended together – a Jenkins plugin for Chef, for example. Developers can also choose a technology stack with DevOps in mind. For example, they can build a REST API layer with Dropwizard, which has a built-in health check for deadlocked threads and allows developers to add custom health checks for things like service availability and database connectivity.
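A custom Dropwizard health check is just a small Java class. In this sketch, DatabaseClient and its ping() method are hypothetical stand-ins for whatever connection wrapper an application actually uses:

```java
import com.codahale.metrics.health.HealthCheck;

// A sketch of a custom Dropwizard health check for database connectivity.
public class DatabaseHealthCheck extends HealthCheck {
    private final DatabaseClient database;  // hypothetical connection wrapper

    public DatabaseHealthCheck(DatabaseClient database) {
        this.database = database;
    }

    @Override
    protected Result check() throws Exception {
        if (database.ping()) {              // hypothetical connectivity probe
            return Result.healthy();
        }
        return Result.unhealthy("Cannot reach the database");
    }
}
```

Once registered with the application’s environment, the check is reported alongside Dropwizard’s built-in checks on the admin port, giving operations a single endpoint to monitor.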
All of these tools can be mixed and matched, and there is some overlap among them. Just don’t let tools drive your mission; they only exist to support the DevOps mindset.
DevOps and the cloud
Maintaining physical machines and installations of Git, Jenkins and other project automation tools is probably not your core business. The cloud allows your staff to outsource these concerns and focus on the fun stuff.
The cloud is orthogonal to the DevOps concept, but it is fairly straightforward to see why they complement each other. Instead of locally installed Git and Jenkins, developers can use PaaS equivalents like GitLab and CloudBees for their code. So too can operations staff for their Puppet, Chef and/or Vagrant files. Meanwhile Amazon, Google and Microsoft have embraced Docker, so deploying containers to the cloud is relatively seamless.
A reasonable workflow for a DevOps staff would be to use Vagrant and Chef and/or Puppet to automate application deployments to test servers (virtual or physical) and then to use these tools with Docker to configure the production container with the same applications and deploy it to the cloud. By automating as much of the process as possible, DevOps staff can minimize errors and make the deployment process repeatable across environments.
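A sketch of the Vagrant half of that workflow appears below; the box and cookbook names are illustrative, and the chef_solo provisioner applies the same cookbook a production deployment would use:

```ruby
# Vagrantfile – a minimal, illustrative sketch of an automated test server
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"              # base box for the test machine
  config.vm.network "forwarded_port", guest: 8080, host: 8080

  config.vm.provision "chef_solo" do |chef|      # provision with the team's own cookbook
    chef.cookbooks_path = "cookbooks"
    chef.add_recipe "myapp"                      # hypothetical cookbook name
  end
end
```

Running vagrant up then produces an identically configured server every time, on any developer’s machine.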
DevOps is a movement that has gained traction for good reason, but like its agile and lean forefathers, it demands a keen understanding of practices and the tools to support them. If you do it right, you may never hear “But it works on my machine!” again.