New York Times creates a dustup over data centers
An article that says the data centers at the foundation of Internet activity are environmentally damaging draws sharp criticism from others who cover the industry.
The New York Times created a dustup in the data center world with its investigative report on the environmental impact of cloud technology and data centers – a critical element in the administration's efforts to save money and energy and to increase efficiency and security. The "yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness," the article states.
The article, "Power, Pollution and the Internet," drew nearly universal criticism from IT insiders, who called out its generalizations about a growing industry that is as diverse as it is complex.
Rich Miller, editor of Data Center Knowledge, writes that the Times' first installment "does an artful, fact-laden job of telling half the story." He acknowledges that many data centers can be more efficient, but takes issue with the article's failure to mention how
companies like Google, Yahoo, Facebook and Microsoft have vastly improved the energy efficiency of their server farms by overhauling their power distribution systems, using fresh air instead of power-hungry chillers (“free cooling”) to cool their servers, and running their facilities at warmer temperatures. New design schemes for modular data centers have emerged, offering highly-efficient designs to customers with smaller operations than Google or Facebook. And we’re even seeing a growing focus on renewable energy, highlighted by Apple’s massive commitment to on-site solar energy and landfill-powered fuel cells.
Dan Woods, a Forbes contributor, is more pointed in his criticism, taking the Times to task for a "confused and incomplete article that is unworthy of its reputation." He writes:
The next problem is the concept of utilization itself. What would be a good utilization? The article never says. It just says that utilization is 7 to 12 percent. The unstated implication is that it should be a lot higher. But how much higher? Should it be 100 percent? 75 percent? 50 percent? Knowing that number would be really excellent. The fact of the matter is that with very stable workloads it is possible to get high utilization and with variable workloads lower utilization would be expected, so you have room to handle spikes.
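Woods' headroom argument can be made concrete with a little arithmetic. The Python sketch below uses entirely hypothetical figures (none are drawn from the Times or Forbes pieces) to show how the average utilization a facility can safely run at shrinks as its workload gets spikier:

```python
# Sketch: why variable workloads push average utilization down.
# All numbers are hypothetical and only illustrate Woods' point that
# headroom for spikes, rather than waste, can explain low utilization.

def max_safe_average_utilization(peak_to_average_ratio: float,
                                 safety_ceiling: float = 0.8) -> float:
    """Highest average utilization that still keeps the worst-case
    spike below a safety ceiling of provisioned capacity."""
    return safety_ceiling / peak_to_average_ratio

if __name__ == "__main__":
    for ratio in (1.2, 3.0, 8.0):   # steady, bursty, and very spiky traffic
        util = max_safe_average_utilization(ratio)
        print(f"peak/average = {ratio:>4}: "
              f"average utilization can be at most {util:.0%}")
```

Under these assumed numbers, a workload whose peaks run about eight times its average already caps safe average utilization near 10 percent, squarely in the 7-to-12-percent range the Times cites, without any waste being involved.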
For more information, check Data Center Knowledge's roundup of coverage.