Why government's next-gen data centers are looking a lot like Google's
Agencies are starting to catch on to the idea that software, not hardware, is the key to efficient storage.
Stop me if you've heard this one: The federal government, pushed to the brink by budget pressures, turns to "smarter" IT practices pioneered in the private sector.
It's a script so formulaic, you'd suspect it came from Hollywood:
INT. THE WHITE HOUSE – NIGHT
Steven VanRoekel sits at his desk, poring over federal data center cost and budget reports. He bangs his fist on the desk in frustration.
VANROEKEL (to himself): “It’s just not possible! We’ll never be able to scale this way.”
As if on cue, the cell phone in his pocket springs to life with a mysterious ringtone.
LARRY PAGE: “Steven? It’s Page. You and I are about to save Washington. Meet me at the Google-Copter.”
Now perhaps this isn’t literally the way it happened. But somehow, a number of federal bigwigs caught wind of the fact that Google has a secret for making enormous data centers economically viable.
And it’s not just Google: Amazon, Facebook and the other Internet giants have known for years that it’s software, not hardware, that’s key to building more efficient, flexible and cost-effective data centers.
In particular, government stands to benefit from the software-centric approach to data storage, given the upsurge in public information being generated and stored in federal data centers and the strain that growth places on IT budgets and scalability.
But what exactly is the software-centric model capable of, how does it operate and why has government taken so long to put it into practice?
Scene 1: Google lights the way
Many years ago, frustrated by expensive proprietary hardware and the costs of integrating disparate systems and replacing outdated ones, Google had an idea:
What if we could build a data center out of nothing but commodity servers — purchased in bulk (on the cheap) and ready to be swapped out, clustered together and controlled by intelligent, policy-driven software? Better yet, with brainy software at the helm, storage and compute could be converged into a single, hyper-efficient tier so that the pain of managing storage separately just goes away.
It all sounds simple in theory. In practice, however, it amounts to something quite difficult: a very large, very scaled-out cluster.
Still, this is precisely what Google built — with as many as 15,000 nodes in a single data center, all operating as one shared resource and all controlled by ingenious software to keep the system unified, efficient, highly available and resilient.
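To make "intelligent, policy-driven software" concrete, here is a minimal sketch in Python. It is not Google's actual stack (that lineage runs through internal systems such as GFS/Colossus); every name in it (Node, place_replicas, the three-replica policy, the racks) is invented for illustration. The point is that a simple deterministic rule can decide where every copy of every object lives, spread copies across failure domains and, when a box dies, tell the cluster exactly what to re-copy and where.

```python
# A toy version of policy-driven placement on a scale-out cluster.
# NOT Google's actual software; every name here is invented.
import hashlib

REPLICAS = 3  # policy: keep three copies of everything

class Node:
    def __init__(self, name, rack):
        self.name = name
        self.rack = rack      # the node's failure domain
        self.alive = True

def place_replicas(key, nodes):
    """Deterministically pick REPLICAS healthy nodes for a key,
    preferring distinct racks (rendezvous/highest-random-weight hashing)."""
    healthy = [n for n in nodes if n.alive]
    # Rank every healthy node by a hash of (key, node): stable, no lookup table.
    ranked = sorted(healthy, key=lambda n: hashlib.md5(
        f"{key}:{n.name}".encode()).hexdigest(), reverse=True)
    chosen, racks_used = [], set()
    for n in ranked:                      # first pass: one replica per rack
        if n.rack not in racks_used:
            chosen.append(n)
            racks_used.add(n.rack)
            if len(chosen) == REPLICAS:
                return chosen
    for n in ranked:                      # fallback: allow shared racks
        if n not in chosen:
            chosen.append(n)
            if len(chosen) == REPLICAS:
                break
    return chosen

cluster = [Node(f"n{i}", rack=f"r{i % 4}") for i in range(12)]
before = place_replicas("object-42", cluster)
before[0].alive = False                   # a commodity box dies...
after = place_replicas("object-42", cluster)
# ...and re-running the same policy says exactly where to re-copy the data.
print([n.name for n in before], "->", [n.name for n in after])
```

Because placement is a pure function of the key and the node list, there is no central catalog to outgrow; "healing" a failure is just re-running the policy and copying data to whichever nodes are newly chosen.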
For Google, this meant enormous cost savings, both on capital expenditures when scaling out and on operating expenditures when maintaining the system.
But talking about this model as “advantageous” for Google misses the point. With enormous quantities (petabytes upon petabytes) of data to store and process, there was simply no viable alternative for Page’s company to pursue.
Put another way, the software-centric model has fast become an operational necessity for any organization storing massive, rapidly increasing amounts of data. Organizations such as Google and — as it happens — the federal government.
Scene 2: Government shrugs
So there's a data center model that seems tailor-made for government, one that has kept the big public Internet companies running cost-effectively for years, and one that's relatively inexpensive.
What, exactly, is the problem?
Well, for starters, the software-centric concept is disruptive in the government space, which means that storage and server companies have a lot of ground to defend and, incidentally, a lot of assets and influence with which to defend it.
Furthermore, even if government agencies could succeed in swapping out millions of dollars' worth of proprietary hardware for commodity x86 servers, they would still need the software to control it: some very specialized, government-compatible software.
And here’s where we get to the root of the problem: Google devised its system for Google, not for government.
In attempting to replicate Google’s model, government IT professionals would have to solve two additional problems that Google never considered: 1) how to accommodate government’s voracious appetite for VMware-driven virtualization, and 2) how to deliver a standardized “plug and play” installation for hundreds of government entities with their own systems and applications.
The good news is that new, successful companies have created powerful, cost-effective solutions to these very issues, which are sometimes referred to as “converged infrastructure” challenges.
So the solutions do exist. But when you combine significant pressure from traditional storage and server vendors with government’s typical inertia, the software-centric model needs one additional catalyst: pain. For federal bigwigs to make the switch, they need to be feeling a whole lot of it.
Scene 3: The agony and the ecstasy
Typically, government data centers have a three-tiered architecture, with server and storage hardware connected by a dedicated network — either a storage-area network (SAN) or a network-attached storage (NAS) system.
This is a sensible approach, because it centralizes storage resources and reduces the inefficiencies and rigidity of directly attached storage (DAS) boxes.
But something bad happens to networked storage when data centers grow and virtualization expands: the SAN/NAS starts sucking up more and more time, energy and IT resources.
Eventually, when the data centers grow large enough, the costs, delays and inefficiencies take hold, and everyone starts to feel a great deal of pain.
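To see why, it helps to run the back-of-envelope numbers. The figures below are invented for illustration (no real SAN or server is being quoted), but the shape of the problem is real: the shared array's throughput is a fixed ceiling, demand grows with every virtualized host added, and aggregate local-disk bandwidth grows right along with the cluster.

```python
# Invented, illustrative numbers only: the point is the shape of the
# curves, not the specific figures.
SAN_CEILING_MBPS = 8_000         # shared array: a fixed throughput ceiling
PER_HOST_DEMAND_MBPS = 100       # storage I/O each virtualized host needs
PER_HOST_LOCAL_MBPS = 500        # local disk/flash in each commodity box

for hosts in (10, 50, 200, 1_000):
    demand = hosts * PER_HOST_DEMAND_MBPS
    san_headroom = SAN_CEILING_MBPS - demand        # negative = pain
    das_aggregate = hosts * PER_HOST_LOCAL_MBPS     # grows with the cluster
    print(f"{hosts:>5} hosts | demand {demand:>7} MB/s | "
          f"SAN headroom {san_headroom:>7} | local aggregate {das_aggregate:>8}")
```

Past the crossover point, the only SAN remedy is a bigger, costlier array; in the converged model, the remedy is another cheap box.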
That’s the problem in government today: Data centers are larger than they’ve ever been, and growing at a terrifying rate, thanks to the explosion of big data.
In fact, the massive growth of public information generated and stored in government data centers, combined with the storage demands of virtualized computing, has strained scalability to the breaking point.
Thus, it’s no wonder that government is finally beginning to adopt the Google model: commoditized DAS storage that’s coordinated and controlled by intelligent software. In other words, the software-centric approach to storage.
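That scaling behavior is easy to demonstrate in miniature. Here's a hedged sketch (invented names, the same hash-based placement idea as above) of why a software-coordinated cluster can grow one commodity box at a time: when a node joins, only about 1/N of the data has to move.

```python
# Illustrative only: measure how much data moves when a 15-node cluster
# grows to 16 under hash-based placement. All names are invented.
import hashlib

def owner(key, nodes):
    # Rendezvous hashing: the node with the highest score for this key wins.
    return max(nodes, key=lambda n: hashlib.md5(f"{key}:{n}".encode()).hexdigest())

nodes = [f"node{i}" for i in range(15)]
keys = [f"block-{i}" for i in range(20_000)]

before = {k: owner(k, nodes) for k in keys}
after = {k: owner(k, nodes + ["node15"]) for k in keys}

moved = sum(before[k] != after[k] for k in keys)
print(f"{moved / len(keys):.1%} of blocks moved")   # roughly 1/16, ~6%
```

Contrast that with a forklift upgrade of a storage array, where growing capacity means migrating everything.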
After all, this isn’t an unproven, bleeding-edge concept; it’s simply bringing the hyper-efficiency of Google to the government back office.
And when government can scale the way Google can scale, there’s your Hollywood ending.