Future of IT: Cloud computing heralds winds of change, but heavy-duty OSes hold their ground.
Last year, when all the analysts were still heady about the release of Microsoft Vista, some predicted that it would be the last desktop operating system that Microsoft would ever release.
After all, we're in the age of the Internet now. Operating systems are so 20th century.
In the future, we will all get our computing from the cloud, the latest buzz phrase for getting computing via the network. Credit Suisse research analyst Jason Maynard predicted in a report that 'Vista is the last of the Big Bang operating system releases from Microsoft.'
And yet, systems administrators and users slowly began to see that, like every other operating system, Vista isn't the be-all and very likely not the end-all. First, Microsoft announced that it would ship a service pack for Windows Vista in the first half of 2008 to take care of some inconsistencies.
Then the company began telling its sales force about a follow-up OS it was working on, to be released around the turn of the decade, under the working name Windows 7.
'I think OSes are going to evolve from what we traditionally know them as today, but a lot of the same types of functions are going to have to be there,' said Pat Arnold, chief technical officer at the federal division of Microsoft. 'You still worry about identity, security, input/output, memory, trust: all sorts of things.'
'Some vendors like to talk about the death of the OS, with virtualization causing the end. But I like to quote Mark Twain: Reports of our demise have been greatly exaggerated,' said Andrew Cathrow, product marketing manager at Red Hat, which offers the Red Hat Enterprise Linux operating system.
Nonetheless, changes are afoot. 'I think we'll see the evolution of the OS,' he added.
Thin is in
The operating system is one of the most fundamental tools in computing.
At its simplest, an operating system is the interface between the hardware (the computer itself) and the software the user runs. Want to print a document? The word processor doesn't know where the printer is, so it sends the print command to the OS, which keeps track of where the printer is and which formats it accepts.
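As a rough illustration of that division of labor, here is a minimal Python sketch (not from the article; the file name is hypothetical) in which an application hands a document to the OS print service instead of addressing the printer directly:

    import platform
    import subprocess

    def print_document(path):
        """Ask the OS to print a file; the application never talks to the printer itself."""
        if platform.system() == "Windows":
            import os
            os.startfile(path, "print")               # the Windows shell routes the job to the default printer
        else:
            subprocess.run(["lpr", path], check=True)  # lpr hands the job to the Unix print spooler

    print_document("quarterly_report.txt")             # hypothetical file name

Either way, the OS, not the word processor, knows where the printer is and how to reach it.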
These days, however, our relationships with our computers are anything but simple. Thanks to the Internet, we may not need to rely as heavily on applications on our own computers to do our work.
A new breed of applications can be run on any OS, using only the browser.
Amazon offers several stores' worth of goods, accessible only through a browser. Google offers email service. Second Life offers an entire virtual world. With Web applications, the systems administrator doesn't need to worry about installing or maintaining the application.
All that is done by the company offering the online application or service.
Web applications reinforce this line of thinking. 'Over time, you will think less in terms of hardware and more in terms of service levels. It's all about applications,' Cathrow said.
If applications are no longer dependent on the OS, then do we need a heavy OS, such as Microsoft Windows or Linux on a PC? Do we even need all the functionality of a full-fledged desktop or laptop PC? Citrix, for instance, offers a range of software that will let users execute Microsoft Windows programs via the network. By using a Citrix gateway, the user doesn't need the fastest computer to run the latest application written for Microsoft Windows. Users save money, space and energy by accessing the application from a centralized server.
'Today, we live in a world that is application-centric,' said Martin Duursma, chairman of the chief technology office at Citrix. 'As far as the user is concerned, they aren't going to worry about where the application is running, if it is running under PowerPoint or on the Web.' Duursma said users might move from thinking about computers to thinking about states, such as a state for all your workplace documents, accessible from a computer anywhere and kept at a data center. 'You have a thin-client device where you don't have any ownership of the local device.'
'I talk to my children about operating systems, they look at me strange and say, 'Why would I care?' ' Duursma said. 'They just want to get to their applications.'
Sun Microsystems offers a line of inexpensive thin clients under the same rationale.
With Sun Ray thin clients, servers offer up the programs requested by the user; the client computer essentially acts as a terminal.
Such an approach needs reliable, high-throughput networks to transport application interfaces and data to and from the data center and the client. 'We're going to see things move more towards virtualizing the desktop as the network builds out, and centralizing back at the data center,' said Bill Vass, president of Sun Microsystems Federal.
OSes are also going on a diet in the server room, thanks to increased use of an old technology known as virtualization. Virtualization has grown in popularity because of its ability to consolidate servers. Say you have two applications that require dedicated servers. Each requires a particular release of an operating system, though neither uses more than a fraction of its server's resources. Virtualization can run both of those applications, along with their supporting OSes, on a single box, saving costs on hardware and power.
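As a rough sketch of what that consolidation looks like from the administrator's side (this example is not from the article), the Python snippet below uses the libvirt bindings to list the guest systems sharing one physical host; the connection URI is an assumption about a typical KVM/QEMU setup:

    import libvirt  # libvirt-python bindings; assumes a local KVM/QEMU hypervisor

    # Connect to the hypervisor on this box and list the guests consolidated onto it.
    conn = libvirt.open("qemu:///system")              # URI is an assumption; adjust for your host
    for dom in conn.listAllDomains():
        state, max_mem_kib, mem_kib, vcpus, cpu_time = dom.info()
        print(f"{dom.name()}: {vcpus} vCPUs, {mem_kib // 1024} MB in use")
    conn.close()

Each guest carries its own OS, but all of them draw on the same physical processors, memory and power supply.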
'End users don't interact with operating systems; they interact with applications and the application is what drives the business,' wrote Shayne Higdon, director of corporate development at Quest Software, in an e-mail.
BEA, for instance, recently released a product called Liquid VM, which allows machines to run Java programs directly on a VMware ESX Server hypervisor, without a general-purpose OS at all. 'I don't know if BEA/VMware customers have adopted this, but it is the first time we've seen an application vendor pushing away from the OS vendors,' Higdon said.
Because virtualization is getting easier, independent software vendors are starting to package their applications with a dedicated OS. When you buy the software, you actually get the application and a complete operating environment. This practice allows vendors to control the environment, which cuts the cost of supporting different OSes. 'So you don't patch the application, you patch the entire unit together,' Cathrow said.
The kind of OS these vendors need is slim, incorporating only the functions that particular application requires. To accommodate these software vendors, Red Hat has developed a stripped-down version of Red Hat Enterprise Linux called the Appliance Operating System.
Fat holds out
The interesting thing about thin clients is that, even though they seem to be a good idea, implementations have thus far been few.
For some reason, sales of desktop and laptop PCs remain strong.
'Cloud will never subsume all of the client. Bandwidth is never as good as we'd like it to be. The creativity of the software community places as much demand on bandwidth as compute power,' said Jim Held, director of terascale computing at Intel. 'Keeping proper balance between client and servers is something we will continue to do.'
'There are certain kinds of uses that will benefit from increased bandwidth and better connectivity,' Held said. 'On the other hand, there are tasks that scale much better when you implement them on a client.'
Plus, hardware advancements put new demands on the OS. Consider multicore chips.
Tomorrow's processors will have eight, 16, 32 or even more cores. It will be the OS' job to figure out how to best use these cores, Held said. The OS will have to assign threads to cores, balancing the work among all the cores for greater efficiency.
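For a concrete, if tiny, illustration of that placement problem, here is a Linux-only Python sketch (not from the article) that asks the kernel which cores the current process may use and then restricts it to two of them; a real scheduler makes decisions like this for every thread, continuously and automatically:

    import os

    # Linux-only sketch of assigning work to cores.
    print("cores on this machine:", os.cpu_count())
    print("cores this process may use:", os.sched_getaffinity(0))

    # Restrict the current process (pid 0 means 'self') to cores 0 and 1.
    os.sched_setaffinity(0, {0, 1})
    print("new affinity:", os.sched_getaffinity(0))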
Mike Kemp, CTO at Liquid Computing, noted that scheduling itself will take on more importance in the years to come. We may see OSes with specialized schedulers: one could be tweaked for real-time applications, for instance, while another may work better in a clustered environment.
On the server side, Vass noted that there is a trend among server OS providers to bundle in more features.
Sun's Solaris, for instance, now comes with the PostgreSQL database, the company's GlassFish application server and the Lightweight Directory Access Protocol. Red Hat recently purchased JBoss, a company that maintains an open-source application server, which has been incorporated into Red Hat's own OS stack.
With these companies, it appears as if the OS is growing in size and complexity.
'There's always this pendulum between personalized computing and centralized computing,' Arnold said. 'We see that as being driven by progress in both computation and communication.' He added that the platforms of the future may be more mobile. And while they will have connectivity, they will also do 'lots of local processing,' using data generated by local sensors. Imagine, for instance, your cell phone producing real-time weather reports based on information from nearby weather sensors, along with data feeds from the network.
In many ways, this balance between the network and internal computation is nothing new. When Donald Becker, founder and chief scientist of Scyld Software and co-developer of the Beowulf cluster software, reflected on the next 25 years of the operating system, he was a bit surprised to realize that not much has happened in the past 25. He pointed out that if you put a 1979 Unix systems administrator in a time machine, brought him to the present day and planted him in front of a new Linux server, he would feel at home. 'He'd have to learn how some things have changed, but it wouldn't be so alien that he wouldn't know what to do.'
It's not that the Unix model hasn't changed, but the basic goal of the OS remains largely the same: make the devices attached to the computer (the screen, the printer, the network card) work as smoothly as possible.
No matter how many layers of abstraction you get, some piece of software must deal with the physical inputs and outputs. And if you want to get the best performance from the system, you need device drivers that take full advantage of the hardware. And those drivers need instructions to interact with a specific OS.
'And just when we think we've covered every possible new device, something else comes along,' Becker said. Yes, OSes will be with us for a long time.
Microsoft's plan for future computing
If the technical visionaries are correct, the future of computing will be a mix of applications that reside on the desktop and services obtained via the Internet, or the compute cloud as it is sometimes called. Microsoft is already configuring many of its chief development and runtime products to optimize this process of seamlessly drawing value from both sources.
In October, Microsoft announced a long-range initiative, called Oslo, meant to accommodate such a future.
'Oslo is a set of technologies that will enable people to design, build and deploy service-oriented applications,' said Dustin Sell, a technology specialist in the federal office at Microsoft.
Microsoft will re-engineer a number of its products, such as Visual Studio, BizTalk Server and the .NET Framework, so they work more seamlessly with one another under the Oslo configuration.
The company will also introduce a new, as-yet-unnamed modeling language that will help architects define how services will work together. 'We want to make modeling a first-class citizen in application development,' Sell said. Architects and domain experts can use this modeling tool to sketch how a business process will work. When they are finished, the tool will generate a skeleton for the program, allowing coders to merely fill in the details, said Ken Knueven, federal programs manager at Microsoft.
'People can either compose the application from existing Web services or perhaps identify things that need to be exposed as a Web service,' Knueven said.
'From our standpoint, we're looking at it from both software and services. You may be leveraging some of the existing software from the desktop and the server, and you may be leveraging some services from across the cloud, either Microsoft's or others'.'
Oslo is in its early stages, but during the next few years, we will start to see Oslo-related products, Sell said.