Virtualization increases the need for reality management in IT
Virtualization has reduced data center footprints, but little has been done to disentangle the duplication and complexity of information and applications sitting on those servers, observes Francis Hsu, an information executive at the Homeland Security Department.
Never has the word “management” been so widely used and abused. Put a prefix of just about anything before “management” and you get a whole new discipline: asset, data, expectation, financial, information, knowledge, risk, etc. Yet when it comes to information technology, we never seem to get our arms around the whole problem. We seem content with partial solutions.
Let me offer a more holistic term: reality management. Reality management has special relevance to the information technology world because all too often people and organizations manage the technology instead of the information. Reality consists of the concrete and tangible as well as the abstract and intangible. The tangibles engage our senses: We can see, hear, smell, touch, taste and feel them. The intangibles engage our minds: We can create, evaluate and think about them.
But the tangibles and the intangibles cannot be managed the same way. There are good reasons behind the sayings “out of sight, out of mind” and “a place for everything and everything in its place.” These aphorisms explain human behavior: The concrete tangibles we can see, touch or sense get our attention. The abstract intangibles do not.
The current crazes in server consolidation and virtualization, data center rationalization, wattage-per-square-meter reduction and so forth all point in this direction. A single room of server racks handling the same workload once spread across a server farm extending as far as the eye can see is a visible improvement in efficiency. Shrinking the hardware footprint directly reduces the number of data centers and their wattage consumption. This is clearly the low-hanging fruit and the quick win everyone strives for.
Such exercises also dovetail nicely with how money is spent. When technology investment absorbs most of the annual budget, everyone rightly focuses on it.
However, here’s the glaring paradox: Although virtualization reduces server, data center and wattage counts on the technology side of the business, it does little or nothing to manage the information residing therein.
Shrinking 10,000 servers in five data centers that use 1,000,000 watts into 1,000 rack servers in one data center that uses 50,000 watts sounds really impressive. But shrinking the tangible footprint does nothing to disentangle the sheer quantity, duplication and complexity of the non-standardized, siloed information and applications sitting on those servers.
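The hardware arithmetic in that example really is dramatic, as a quick back-of-the-envelope check (a sketch using only the figures above) confirms:

```python
# Back-of-the-envelope check of the consolidation figures above.
servers_before, servers_after = 10_000, 1_000
watts_before, watts_after = 1_000_000, 50_000

print(f"Servers cut by {1 - servers_after / servers_before:.0%}")  # 90%
print(f"Wattage cut by {1 - watts_after / watts_before:.0%}")      # 95%
```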
The virtualization paradox is subtle. The virtual, by definition, is not real; it is an illusion. And managing the illusion misses entirely the real purpose of IT management: the information itself.
When file systems, relational database systems and applications duplicate (but not exactly) the information they use, such abstract intangibles seldom get the attention they need. For example, perhaps more than 100,000 applications exist in the U.S. federal government dealing with people: their names, addresses, phone numbers, e-mails, dates of birth, countries of birth, countries of citizenship, relationships to others, their status, etc. Every major cabinet department, including the State, Defense, Treasury, Justice, Commerce, Energy, Transportation, Veterans Affairs and Health and Human Services departments, as well as separate agencies such as the Securities and Exchange Commission, Federal Trade Commission and Federal Communications Commission, has applications dealing with people. Often, such applications are mandated by law or regulation. Because America is such an immigrant-based country, identifying people by country of birth or citizenship is necessary. With only 192 countries in the United Nations, one would think sharing country codes, for instance, would be easy. After all, we don’t have to own a code to use it.
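A minimal sketch of what such sharing could look like, assuming ISO 3166-1 alpha-2 codes as the common standard (the table and function names here are hypothetical illustrations, not any agency’s actual system):

```python
# One shared reference table of country codes (ISO 3166-1 alpha-2),
# maintained in a single place and reused by every application.
# Names and contents here are hypothetical illustrations.

SHARED_COUNTRY_CODES = {
    "US": "United States",
    "MX": "Mexico",
    "IN": "India",
    "PH": "Philippines",
    # ... one entry per recognized country, maintained once
}

def country_name(code: str) -> str:
    """Resolve a country code against the shared table.

    Every application calls this instead of keeping its own
    slightly different copy of the list.
    """
    try:
        return SHARED_COUNTRY_CODES[code.upper()]
    except KeyError:
        raise ValueError(f"Unknown country code: {code!r}")

# Two different "applications" resolving against the same reference data:
print(country_name("mx"))   # Mexico
print(country_name("US"))   # United States
```

The point of the design is that the reference table is maintained exactly once; every application that resolves a code consults the same source instead of carrying its own divergent copy.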
Yet in most large organizations, most applications are still being developed in server-centric mode: If an application resides on a server, all the data is packaged with it. Application owners always want total control of everything on their machines.
The reasons aren’t hard to understand. Before the Internet, when computers did not have to communicate with other computers and bandwidth was expensive, this was a given. Once the Internet spread and bandwidth prices dropped, the connections became cheap and easy, but connecting the applications and sharing the data proved enormously difficult. Except for the Internet Protocol (IP), connecting any two random applications created enormous friction because each interface was unique to that connection. What IP brought together, the application interfaces tore asunder. The vendors were happy: Every point of friction was another revenue stream for the bottom line. Complexity skyrocketed.
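The combinatorics make that skyrocketing concrete: If every pair of applications needs its own unique interface, n applications can require up to n(n-1)/2 of them. A quick sketch (the function name is just illustrative):

```python
# Point-to-point integration: every pair of applications gets its own
# unique interface, so the interface count grows quadratically.

def point_to_point_interfaces(n: int) -> int:
    """Unique pairwise interfaces among n applications: n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:>5} applications -> {point_to_point_interfaces(n):>8,} interfaces")

#    10 applications ->       45 interfaces
#   100 applications ->    4,950 interfaces
#  1000 applications ->  499,500 interfaces
```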
If we plot the cost per business function of adopting computers during the first 25 to 30 years of their use, the trend would track downward. Given how much IT industry revenues have jumped since the widespread use of the Internet, it would not be surprising to find that much of that leap was due to an increase in cost per business function as a result of all that complexity.
It is time for both vendors and customers to get real about managing IT: Managing the invisible and intangible, not just the visible and tangible, is where the huge payoff will be.