What agencies can learn from Google's data center secrets
The search giant, which operates some of the world's largest data centers, reveals a unique approach to cooling and efficiency.
For years the federal government has been moving its data into bigger and bigger centers, consolidating servers in the name of efficiency and cost reduction. However, a recent report cast some doubt on whether large facilities were actually a model for efficiency, or simply a growing problem.
In the wake of that, Google, which maintains some of the largest data centers in the world, including many of those behind the much-discussed cloud, has pulled back the veil on how it cools and maintains its facilities. What had long been a well-kept secret is now revealed in striking detail and photos, and it should give feds plenty to think about in how they maintain and design their own centers.
What Google has shown the world is unique in several ways. The server rooms themselves are austere in design, with almost everything sacrificed in the name of efficiency. Google also uses water cooling more extensively than probably anyone else in the world and is not afraid to run water pipes directly above electronic equipment.
Specifically, the company has turned the entire main room of each data center into what would typically be the cool aisle. Individual racks and corridors within that room become the hot aisle. That's unique in and of itself, but it wouldn't work without some serious engineering.
Air at about 80 degrees enters the racks from the bottom and blows across the CPUs. Fans then direct the air upward to the top of each enclosure. One secret is that the servers themselves resemble stripped-down cars in a lot of ways. They are really just open frames that hold the computing parts in place. There are no sides or cosmetic additions, which would be unnecessary and would only hinder airflow.
Even the area around the racks is designed to be inexpensive and flexible, with plastic curtains corralling and directing the air instead of solid walls. The curtains work just fine to contain airflow, can be reconfigured easily if needed and are cheap to buy and maintain.
Once the hot air, heated by the servers to about 120 degrees, reaches the top of the rack, it flows over a series of stainless steel pipes filled with cold running water. That brings the air back down to room temperature before it is expelled into the main room. That air, controlled for humidity and filtered, is then recycled back into the racks from the bottom.
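To put those temperatures in perspective, a quick back-of-the-envelope calculation shows roughly how much heat the water coils have to absorb per rack. Only the intake and exhaust temperatures below come from the article; the per-rack airflow is an assumed, illustrative figure, not anything Google has published.

```python
# Rough sensible-heat estimate for one rack, using the temperatures reported
# in the article (about 80 F intake, 120 F exhaust). The airflow value is an
# assumption for illustration only, not a Google specification.

AIRFLOW_CFM = 1000                  # assumed airflow through one rack, cubic feet per minute
INTAKE_F, EXHAUST_F = 80.0, 120.0   # approximate temperatures from the article

CFM_TO_M3S = 0.000471947            # cubic feet per minute -> cubic meters per second
AIR_DENSITY = 1.18                  # kg/m^3, air at roughly 80 F
AIR_CP = 1005.0                     # J/(kg*K), specific heat of air

delta_t_k = (EXHAUST_F - INTAKE_F) * 5.0 / 9.0      # temperature rise in kelvin
mass_flow = AIRFLOW_CFM * CFM_TO_M3S * AIR_DENSITY  # kg of air moved per second

heat_removed_w = mass_flow * AIR_CP * delta_t_k     # Q = m_dot * c_p * delta_T
print(f"Heat carried away per rack: {heat_removed_w / 1000:.1f} kW")
# With these assumptions, the coils absorb on the order of 12 kW per rack.
```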
The water gets piped to cooling towers to bleed off its heat before returning to the server room to begin its journey through the data center once again. Some of the work rooms at Google data centers look a bit like the control rooms for a water park; there are as many valves as electric wires in some of the photos.
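A similar rough sketch, using the assumed per-rack heat load from above and an assumed temperature rise for the water, suggests how much water has to circulate to carry that heat out to the towers. Neither number is a published Google figure.

```python
# Water-side counterpart to the air-side estimate above: how much water flow
# would be needed to carry the same per-rack heat load to the cooling towers.
# Both values below are assumptions for illustration.

HEAT_LOAD_W = 12_400        # per-rack heat load from the air-side sketch
WATER_DELTA_T_K = 10.0      # assumed temperature rise of the water across the coils
WATER_CP = 4186.0           # J/(kg*K), specific heat of water
WATER_DENSITY = 1000.0      # kg/m^3

mass_flow = HEAT_LOAD_W / (WATER_CP * WATER_DELTA_T_K)   # kg/s, from Q = m_dot * c_p * delta_T
liters_per_min = mass_flow / WATER_DENSITY * 1000 * 60

print(f"Water flow per rack: {liters_per_min:.0f} L/min "
      f"({liters_per_min * 0.264172:.1f} gal/min)")
# Roughly 18 L/min (about 5 gal/min) per rack under these assumptions.
```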
Google plans to publish several blog posts explaining individual aspects of the data center cooling operations in more detail. Even now, it’s probably given federal data center managers and planners a lot to think about in terms of efficiency, the use of water cooling in close proximity to electronic equipment, what components are truly necessary, and what can be eliminated.