Evolving options for on-premises data centers
Even as more operations move to the cloud, there are times when on-premises computing can be the right choice.
Ever since cloud computing became a realistic option for large operations, it has dominated the data center conversation. Efforts to consolidate data centers and cut the related operations and maintenance costs have pointed to the cloud as a natural destination. There are, however, instances when on-premises computing is the better -- or even the necessary -- choice. And the terminology in the space has so much overlap that agencies operating an on-prem site often call it a cloud.
So what, exactly, defines a data center? Cameron Chehreh, Dell Federal CTO, said the federal government will often call a couple of servers in a closet a data center, but that stretches the more common commercial definition: a centralized location that houses an organization's computing, storage and applications. David McClure, chief strategist at Veris Group, offered a simpler definition: a big building full of servers.
A cloud, meanwhile, might be described as “someone else’s servers.” There's significantly more to it than that, of course, but it is essentially a remote data center, usually managed by someone other than the organization or agency storing its data there. This is the space filled by Amazon Web Services, Microsoft, Google and many others. “I think of cloud as locationless computing,” McClure said. It is available on demand and priced according to use.
Blurring the lines further is private cloud, which can mean a few different things. “If you put three cloud specialists in the room you may get twelve separate answers” on what a private cloud is, Chehreh said. Christian Heiter, CTO of engineering at Hitachi Data Systems Federal, agreed that there is little consensus on the term. “This is where the definition gets a little fuzzier -- it's a term of art,” he said.
The glib definition is that private cloud is just a data center with a fancy name to make whoever is building it sound hip to the trends. McClure, however, said that isn’t necessarily accurate. For an on-premises data center to truly be considered a private cloud, it must implement modern technologies such as virtualization and offer cloud-like scalability, he said. A private cloud could also be a remote data center with a single tenant.
The Army, for example, has recently begun working with IBM to build a consolidated private cloud solution at Redstone Arsenal near Huntsville, Ala. IBM is building and will manage the facility, but it will be an on-premises facility devoted to the Army's needs. The first phase of the project aims to put 38 applications into the cloud and meet the Defense Information Systems Agency’s Impact Level 5 -- the highest security level for unclassified data.
McClure was working in the federal government during the Obama administration when the first push to cloud began. Federal officials, including then-federal CIO Vivek Kundra, recognized the opportunity to lower operations and maintenance costs by consolidating on-premises data centers and moving some operations to the cloud. Yet a decade later, “we’re still in the crawl-walk phase of cloud computing,” McClure said.
Efforts are being made to make the switch, though, and Dave Powner, director of information technology management issues at the Government Accountability Office, is a longstanding proponent. “It is really contingent on how well agencies can optimize what they have,” Powner said of the move to cloud.
In Powner's view, however, that optimization has been fairly limited. An Office of Management and Budget memo said the average government server is utilized at just 9 to 12 percent of its capacity. “That was really the impetus to start this data center consolidation effort,” he said. The goal is to get that number up to 65 percent.
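The arithmetic behind that target helps explain the consolidation push. A quick back-of-the-envelope sketch in Python -- the fleet size here is hypothetical, chosen only to make the ratio concrete:

```python
# Back-of-the-envelope math behind the consolidation push. The fleet
# size below is hypothetical, used only to make the ratio concrete.
current_util, target_util = 0.10, 0.65   # ~10% today vs. the 65% goal
servers_now = 1000                       # hypothetical fleet
servers_needed = servers_now * current_util / target_util
print(f"~{servers_needed:.0f} servers could do the work of {servers_now}")
```

At those utilization rates, roughly one server in six is doing all the work the fleet is sized for.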
A 2014 GAO report on the government’s consolidation efforts estimated that the Treasury Department avoided more than $577 million in costs through consolidation between 2011 and 2013. Other agencies have also seen tens of millions in savings, the report said.
Some agencies have made big strides, he said. The Departments of Agriculture, Treasury and Justice; the General Services Administration; and NASA have all closed 50 percent of their data centers. The Defense Department has not yet hit 50 percent, but it has closed 700 facilities, Powner said.
“I think there is more of an acceptance that you can meet some of the security requirements through cloud offerings,” Powner said, attributing that shift to the examples set by early adopters, which have allowed others to see implementations that actually work.
Yet while cloud promises cost savings and flexibility, the future of a government without data centers is nowhere in sight. “I don’t necessarily see everything moving to cloud," said Sophia Vargas, an infrastructure analyst at Forrester. "I think it's kind of stuck -- potentially for a long time -- in more of a hybrid, multistate.”
That seems to be the consensus. At least for the foreseeable future, hybrid solutions will define the data center, and the inner workings of those data centers are changing accordingly.
How hybrid helps
Vargas’ colleague Richard Fichera, Forrester's vice president and principal analyst of infrastructure, said the simple definition of a hybrid data center is exactly what it sounds like: using both enterprise and cloud solutions for data storage. Using a local data center and one or more cloud services can provide a best-of-both-worlds scenario that can reduce cost and lead to the consolidation that Powner seeks.
“People are starting to find balance,” Chehreh said. “There is nothing but a bright future for hybrid moving forward.”
Gary Danoff, the senior vice president of cloud solutions at DLT Solutions, said hybrid makes sense right now because it helps agencies move to the cloud “in baby steps.” Agencies are starting with applications like websites that are easier to transfer to a cloud environment.
Websites are ideal for the cloud, as they are “more transient and dynamic in nature," Vargas said. "Anything that needs to support dynamic workloads” is a good fit too.
Most agencies are keeping more sensitive applications on premises for now. This is partly because the Federal Risk and Authorization Management Program, which sets standards for cloud security, has only begun to authorize cloud services at the higher security baselines. But some experts say the desire to stay on premises for security reasons is nothing more than a “peace of mind” consideration.
Heiter said that if two data centers -- one on-premises and one in the cloud -- are using the same hardware and security procedures, it would be hard to notice a difference. “In the end it comes down to trust,” he said. If an IT manager wants to be able to walk down the hall and ask about an application's status, then that’s something only an on-premises solution can offer.
Vargas said cloud can actually be more secure in some circumstances. The security algorithms used to monitor and keep systems safe become more efficient with more data to study, she noted, and no one has more data than the cloud providers.
There are other considerations that argue for keeping on-premises data centers -- the simplest being that agencies aren’t ready to dump their investments in equipment. “When they come up for maintenance maybe you’ll look at transitioning,” Heiter said.
Such debates will be different for every agency. And as IT managers hash out the appropriate hybrid strategy, they must also plan for the ways data center technology itself is changing.
Virtualization, hyperconvergence and software-defined everything
In the 1990s it was difficult to stack multiple applications on a single server, Fichera said. Entire companies popped up to tackle this issue and made a living tuning and tweaking systems to facilitate application stacks. “But it was tricky,” he said. For a long time industry pushed the idea of “one server, one application,” which caused server sprawl and underutilized infrastructure. “Virtualization was a godsend for that,” he said.
“With the advent of virtualization it has blurred the lines of what a physical device is going to be,” Heiter said. “Now it becomes a little bit less physical and a little more abstract.”
Virtualization allowed more than one software-defined server to run under a hypervisor -- the software layer that lets a single piece of hardware share its capacity among multiple applications. Software-defined data centers take advantage of virtualization and other technologies like consolidated monitoring tools and hyperconverged infrastructure.
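The capacity-sharing idea can be sketched in a few lines of Python. The toy model below is purely illustrative -- the class, names and sizes are invented for this article, not any vendor's API:

```python
# Toy model of the capacity-sharing idea behind a hypervisor: several
# software-defined servers (VMs) draw on one physical host's resources.
# Class names and sizes are illustrative, not any vendor's API.

class Host:
    def __init__(self, cpus, ram_gb):
        self.cpus, self.ram_gb = cpus, ram_gb
        self.vms = []

    def provision(self, name, cpus, ram_gb):
        used_cpus = sum(vm["cpus"] for vm in self.vms)
        used_ram = sum(vm["ram_gb"] for vm in self.vms)
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise RuntimeError(f"host out of capacity for {name}")
        self.vms.append({"name": name, "cpus": cpus, "ram_gb": ram_gb})

host = Host(cpus=32, ram_gb=256)
host.provision("web-frontend", cpus=4, ram_gb=16)   # two apps, one box --
host.provision("database", cpus=8, ram_gb=64)       # no more server sprawl
```

Instead of “one server, one application,” one box now carries many workloads, which is precisely what drives utilization up.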
Software-defined networking abstracts the physical switches and networking infrastructure. When capacity needs to scale, the structure can be easily modified, Heiter said. If you have a new protocol extension or a new security procedure, you can roll it out with a few keystrokes, he said, rather than going through a purchase order, waiting for the vendor and installing a solution in a rack.
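The “few keystrokes” point can be illustrated with a small sketch. The controller and switches below are stand-ins invented for this example, not a real SDN product's interface:

```python
# Illustrative-only sketch of the software-defined networking idea:
# policy lives in a central controller, so a new security rule reaches
# every virtual switch in one call instead of a hardware rollout.

switches = {"sw-1": [], "sw-2": [], "sw-3": []}   # stand-ins for virtual switches

def push_rule(rule):
    """Install one rule on every switch the controller manages."""
    for name, rules in switches.items():
        rules.append(rule)
        print(f"{name}: installed rule {rule!r}")

push_rule("drop inbound tcp/23")   # e.g., block legacy telnet fleet-wide
```

The contrast with the hardware path -- purchase order, vendor lead time, rack installation -- is the whole appeal.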
Similarly, software-defined storage allows data center managers to see a single pool of physical, hardware-based storage that is allocated to various virtual machines on the fly. This allows managers to be more efficient in handling storage resources, Heiter said.
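A toy model of that single pool, with all names and sizes invented for illustration, might look like this:

```python
# Toy model of the single storage pool Heiter describes: capacity from
# several physical devices is presented as one pool and carved out for
# virtual machines on demand. All names and sizes are illustrative.

class StoragePool:
    def __init__(self, devices_gb):
        self.free_gb = sum(devices_gb)   # pooled capacity across devices
        self.volumes = {}

    def allocate(self, vm_name, size_gb):
        if size_gb > self.free_gb:
            raise RuntimeError("pool exhausted")
        self.free_gb -= size_gb
        self.volumes[vm_name] = size_gb

pool = StoragePool(devices_gb=[2000, 2000, 4000])  # three physical arrays
pool.allocate("web-frontend", 500)
pool.allocate("database", 1500)
print(f"{pool.free_gb} GB still free in the shared pool")
```

Because the manager sees one pool rather than three arrays, unused capacity on any device can serve any virtual machine.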
Software-defined data centers are also taking advantage of converged and hyperconverged infrastructure. “They’re the key for modernizing and taking advantage of the automation,” Chehreh said. Converged means bringing networking and storage into a common device; hyperconverged means layering software-defined, virtualized technology on top of that.
“The federal government really gravitated to that because it really allowed them to consolidate,” said Dan Fallon, the director of federal systems engineers at Nutanix, which specializes in hyperconverged solutions.
Putting compute, storage and networking into a single appliance has really helped with consolidation, Fallon said. Organizations can get applications up and running more quickly because there aren't multiple management silos to navigate. “We can power all of those workloads on the same platform,” he said.
The Pentagon took advantage of such efficiencies in a 2015 data center consolidation effort. The 12-month initiative replaced 1,000 servers that took up 60 racks with a Nutanix solution that needed only 10 racks of hardware.
The power of portability
Data centers don’t always have to be either in the cloud or in a building packed with servers; they can also be portable.
Modular data centers, or pods, are often the size of shipping containers and are usually deployed by data center operators to more rapidly deliver the services of a standard data center.
Microsoft started working on portability a few years ago, McClure said, and smaller agencies have since started taking advantage of it.
“You don’t worry about building a data center; you move these things around to where they’re needed,” he said. “Mainly it was small agencies that were looking for cloud environments they wanted on-prem but didn’t want to build a data center for, so pods became a reality.”
Fallon said his firm has been working to increase the mobility of the Navy’s Deployable Joint Command and Control program. A mobile cloud at the edge of a combat area, powered by portable hyperconverged hardware, can communicate with a traditional, fixed-site data center. These new solutions are about a quarter of the size of the system being replaced.
“It just shows the scalability, but also that you don’t have to change architecture regardless of the size,” he said.