Next up: Integration of virtual machines, containers
Agencies can meet increased performance demands by combining cloud-based virtual machines and containers.
As budgets shrink, government agencies are being called to do more with less. Data breaches and security threats put even more pressure on government IT departments, which are already facing the prospect of making costly upgrades to their legacy systems. Many agencies simply do not have the resources they need to transform their IT systems to enable them to meet the increasing demands for security, agility and performance.
Because they are lightweight and flexible, containers are widely described as the third wave of computing, following physical servers and virtual machines. Requiring smaller investments in hardware and software, containers enable faster deployment and remove what is often referred to as “dependency hell,” where an application fails because the libraries or runtimes it depends on are missing from the host. Container technology is considered an ideal tool to help government agencies meet the mounting demands of doing more with less for the following reasons:
Agility. By giving government IT teams a way to build an application once and port it across agencies, containers make it possible to share code in a consistent and predictable way. Working within these containers, IT developers can easily tailor or modify the code to match their agency's specific security and performance requirements, and applications can be built and integrated through one standardized interface.
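As a minimal sketch of that build-once, run-anywhere workflow, the following uses the Docker SDK for Python; the image name, tag and build directory are hypothetical placeholders, and a real application would supply its own Dockerfile:

```python
# Minimal sketch using the Docker SDK for Python (the "docker" package).
# The image name, tag and build path are hypothetical placeholders.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Build the application image once from a directory containing a Dockerfile.
image, _build_logs = client.images.build(path="./permit-app", tag="agency/permit-app:1.0")

# Any agency with access to the image can run it the same way,
# regardless of what is installed on the host machine.
container = client.containers.run("agency/permit-app:1.0", detach=True, ports={"8080/tcp": 8080})
print(container.status)
```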
Scalability. Containers suit a microservices architecture, which allows applications to scale up and out much more easily than they can on virtual machines alone. With the help of orchestration tools such as Docker Swarm, Kubernetes and Kontena, many identical copies of a container can run side by side, and the orchestrator can add or remove copies as demand changes.
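For example, with the official Kubernetes Python client, an operator can ask the orchestrator for more copies of a containerized service in a few lines; the deployment name and namespace below are hypothetical:

```python
# Hedged sketch using the official Kubernetes Python client ("kubernetes" package).
# The deployment name "permit-app" and the "default" namespace are hypothetical.
from kubernetes import client, config

config.load_kube_config()   # reads cluster credentials from ~/.kube/config
apps = client.AppsV1Api()

# Scale out: ask the orchestrator to run ten identical copies of the container.
apps.patch_namespaced_deployment_scale(
    name="permit-app",
    namespace="default",
    body={"spec": {"replicas": 10}},
)
```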
Security. Even with the cost and agility benefits containers bring to the table, security is one essential that government agencies cannot afford to ignore. Containers simplify and streamline iterative upgrades and quality improvements: because every instance runs from the same image, a patched image can be rolled out to all users at once, keeping applications performing as expected.
Hybrid is the new norm
Containers were initially viewed as replacing virtual machines; however, the drive for security and scalability is leading to an integration of VMs and containers.
As Forbes recently reported, “What we are seeing is the convergence of virtualization and containerization to deliver a secure and scalable computer infrastructure.”
By combining VMs with containers, agencies can expect a more stable and secure environment, while still enjoying the flexibility of containers. More important, hybrid containerization requires agencies to spend less on memory and hardware, while speeding deployment, lessening IT workload and significantly reducing duplicate application development. All of this leads to the substantial cost savings agencies are eager to find.
However, while hybrid containerization does provide savings, scalability and security, it also comes with complexities and hurdles that should be addressed with robust testing.
Performance estimates
Given the number of variables to be considered with container hybridization -- including software, security, firewalls and multiple infrastructure configurations -- how can developers ensure the application’s optimum performance in the real world?
Specifications can only reveal so much -- storage, bandwidth, network connectivity, the number of CPUs and servers. Yet, these data points do not provide an accurate assessment of the application’s performance in real time. With so much at stake, developers cannot go live with “back of the envelope calculations.” Slow response times, outages and security breaches are significant risks that will quickly diminish any perceived savings in costs and service improvements.
Developers must have a testing system in place that can help them accurately predict whether an application can scale to meet the growing requirements of data storage and transport. The questions developers should be asking before going live revolve around scalability and performance (a sketch of such a probe follows the list):
- At what limit does the system fail?
- How quickly does the system recover?
- Do we have the resources to respond to increasing demand?
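A minimal sketch of such a pre-launch probe, using only the Python standard library against a hypothetical test endpoint, might ramp up concurrency until errors appear and then time how long the service takes to answer cleanly again:

```python
# Illustrative sketch, standard library only; the endpoint URL is hypothetical.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINT = "http://test.example.gov/api/health"  # hypothetical test target

def hit(_):
    """Return True if a single request succeeds."""
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=5) as resp:
            return resp.status == 200
    except Exception:
        return False

def error_rate(concurrency):
    """Fire `concurrency` simultaneous requests and return the failure fraction."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(hit, range(concurrency)))
    return 1 - sum(results) / len(results)

# 1. At what limit does the system fail?
for level in (10, 50, 100, 200, 400):
    rate = error_rate(level)
    print(f"{level} concurrent requests -> {rate:.0%} errors")
    if rate > 0.05:   # more than 5% failures: treat this level as the breaking point
        break

# 2. How quickly does the system recover once the load is removed?
start = time.monotonic()
while not hit(None):
    time.sleep(1)
print(f"recovered in {time.monotonic() - start:.0f}s")
```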
Load testing: Only part of the picture
Although load testing exercises individual components, it does not clearly indicate how the system will perform in the real world under a variety of usage profiles -- or even whether the system can sustain real-world application loads.
End-to-end testing, however, is designed to reflect real or anticipated quantities of CPU, storage, and LAN and WAN I/O bandwidth. From start to finish, this test reveals whether the flow of an application works as expected and pinpoints exactly where the system slows as the load increases. Developers can easily identify system dependencies, monitor performance and ensure that the integrity of the data is maintained at various load levels.
A programmable virtual machine can be configured to reproduce the framework and requirements of specific real-world scenarios. This type of monitoring allows developers to realistically predict when the system will reach its limits (a code sketch of these steps follows the list) by:
- Distributing the test load across the cloud data center.
- Measuring performance and resource utilization.
- Aggregating the results into an actionable format.
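A sketch of the measure-and-aggregate steps, assuming the third-party psutil library is available on the machine under test; the sampling interval and duration are arbitrary illustrative values:

```python
# Sketch of measuring resource utilization and aggregating it into an
# actionable summary, assuming the third-party "psutil" library.
import statistics
import psutil

samples = []
for _ in range(60):   # sample once a second for one minute while the test load runs
    samples.append({
        "cpu": psutil.cpu_percent(interval=1),    # percent, averaged over the interval
        "mem": psutil.virtual_memory().percent,   # percent of RAM in use
    })

# Aggregate the results: averages for trends, peaks for worst-case headroom.
print("cpu avg %.1f%%, peak %.1f%%" % (
    statistics.mean(s["cpu"] for s in samples),
    max(s["cpu"] for s in samples)))
print("mem avg %.1f%%, peak %.1f%%" % (
    statistics.mean(s["mem"] for s in samples),
    max(s["mem"] for s in samples)))
```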
Testing during continuous integration/development
Without the ability to measure and test an application's end-to-end performance, IT departments may find themselves scrambling to fix catastrophic problems in real time. Testing should not only minimize risk but also enhance performance and spur improvement.
Container applications are likely to have been built on a machine with customized settings and code that will differ from the machines used in the live environment. Any test should be designed with this in mind, and developers should look for factors that could affect the performance of the application regardless of the end machine being used.
One advantage of containers is that they let developers simulate the production environment, running functionality and performance tests during the development cycle and catching bugs while there is still time to make adjustments.
By isolating applications during the development stages, along with everything needed to run them independently -- including application files, runtime environments, dependent libraries and configurations -- developers can efficiently and accurately identify complications and fine-tune the application before moving into production. More importantly, IT developers are able to unearth performance problems and adjust the offending components without disrupting other apps or processes running alongside the container.
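As an illustration, a performance check like the following could run against the containerized application during continuous integration (for instance, under pytest); the service URL and latency budget are hypothetical:

```python
# Hedged example of a CI-stage performance check (runnable under pytest).
# The service URL and latency budget are hypothetical placeholders.
import time
import urllib.request

SERVICE = "http://localhost:8080/api/permits"   # app under test inside the container
LATENCY_BUDGET_S = 0.5                          # fail the build if we exceed this

def test_response_time_within_budget():
    start = time.monotonic()
    with urllib.request.urlopen(SERVICE, timeout=5) as resp:
        assert resp.status == 200               # the app must answer successfully
    assert time.monotonic() - start < LATENCY_BUDGET_S   # and within budget
```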
What's in it for you?
With its unified, collaborative infrastructure, the hybridization of VMs and containers allows government organizations to concentrate on business results, service and cost savings, and to centralize, automate and improve service delivery. Combining cloud-based VMs and containers brings embedded security and performance enhancements, but it can be quite complex to put into use. Smoothly creating, transitioning and deploying these applications throughout government systems is a serious challenge that leaves no room for delays or for performance and security disasters.
By closely following prescribed end-to-end test procedures, agencies can ensure their applications will meet performance requirements for an evolving workload as demands increase.