DOD wants apps up to speed
Despite hardware advances, complex code and heavy traffic put a drag on systems.
Bloated operating systems and applications are preventing military organizations from getting sufficient speed from their information technology systems, according to several speakers at a recent Navy IT Day in Washington.
'We have achieved the promises of Moore's Law,' the decades-old axiom that processing power would roughly double every 18 to 24 months, said Chris Miller, the Navy's domain lead for command, control, communications, computers and intelligence (C4I).
'Much more pervasive now is the problem with software.'
'Software is getting bigger and more complex,' Miller said. 'The Windows Vista operating system is so much bigger than its predecessors, [but] it is not any faster, even though processing speeds have increased.'
Elizabeth Sedlacek, director of information systems and infrastructure at the Marine Corps Systems Command, echoed Miller's complaint. 'Windows 95 required 50MB of hard drive space,' she said. 'Vista requires 15GB.'
Part of the problem is that Moore's Law isn't the only one in the IT universe.
Sedlacek said the growth in resource requirements as software code multiplies illustrates an adaptation of Parkinson's Law: software will expand to fill the resources available to it. The original Parkinson's Law holds that work expands to fill the time available for its completion. A corollary states that software eventually reaches a coefficient of inefficiency, a point at which it becomes so large that it no longer processes data effectively.
Sedlacek summarized her conundrum by citing yet another law. 'Wirth's Law states that software gets faster slower than hardware gets faster,' she said. According to Wirth's law, then, software will always lag behind processing capacity.
But it wasn't always so. 'In the 1970s and 1980s, hardware processing power was wanting, and programmers had to code effectively and efficiently in order to get done what we needed to get done,' Sedlacek said. 'Now that capacity has increased and the software industry is much larger, developers want to put lots of features on software and to do it quickly in order to gain a competitive advantage. Efficiency of coding is no longer a priority.'
A problem the Marines face, for example, is that they rarely operate in a resource-rich environment.
Marines deploy on expeditionary missions, Sedlacek said, and they typically operate with a minimal footprint in areas of limited bandwidth, relying on small handheld devices for information and communications.
She challenged industry to help solve the problem.
Aside from software coding, agencies could address the problem through more efficient data management.
Miller suggested that the Navy needs a data strategy for how it expands applications. Richard Hull, chief scientist at Modus Operandi, agreed in an interview with GCN that getting smarter about collecting and processing data will help software work more efficiently.
'Software gets slower because the data operating over a network is increasing faster than computer processing rates,' Hull said.
Some satellites generate several gigabytes of data per second, Hull said. 'The next generation may be terabytes of information per second,' he said. 'If a computer has to deal with 100 or 1,000 times as much data today as it did yesterday, it's going to be swamped.'
Hull suggested two strategies to cope with the glut of data. One involves prioritizing so that only the data most relevant to the mission is actually processed.
'A weather information system may have collected temperature once per hour, yielding 24 readings per day,' he said. 'Then a new technology comes along allowing you to collect a new temperature reading every second. That's 3,600 times more information than you had before. But that doesn't mean you need to analyze it all in depth. You're really just interested in changes or anomalies.'
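As a rough illustration of the kind of change-and-anomaly filter Hull describes, the following Python sketch assumes a simple per-second temperature feed; the thresholds and function names are illustrative, not drawn from any Navy system.

```python
# Minimal sketch of a change/anomaly filter for a per-second temperature feed.
# The threshold values and reading format are illustrative assumptions.

def filter_interesting(readings, change_threshold=0.5, low=-40.0, high=60.0):
    """Yield only readings that changed noticeably or fall outside sane bounds."""
    previous = None
    for timestamp, temp_c in readings:
        is_anomaly = temp_c < low or temp_c > high
        is_change = previous is not None and abs(temp_c - previous) >= change_threshold
        if previous is None or is_change or is_anomaly:
            yield timestamp, temp_c           # pass up for in-depth analysis
        previous = temp_c                     # everything else is dropped here

# Example: 86,400 per-second readings collapse to a handful of interesting ones.
sample = [(t, 20.0) for t in range(86_400)]
sample[43_200] = (43_200, 35.0)               # a sudden jump worth analyzing
print(len(list(filter_interesting(sample))))  # -> 3: first reading, the spike, the return to normal
```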
Using semantic architectures to analyze and filter data sets up hierarchies of data and processing that can help ensure that only the most interesting data climbs the ladder for in-depth analysis. 'You might have a network of 64 computers filtering the data and passing up relevant data to a level consisting of 16 computers and then to eight computers,' Hull said. 'This can filter out a lot of junk and provides a higher degree of fidelity in information collecting and analysis.'
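The tiered filtering Hull outlines can be sketched the same way: each tier keeps only the most interesting items and passes them up to a smaller tier. In the Python sketch below, the scoring function and tier fractions are assumptions chosen for demonstration.

```python
# Illustrative cascade: each tier keeps the top fraction of items by score and
# hands them to a smaller tier, so only the most interesting data climbs the ladder.

def tier_filter(items, keep_fraction, score):
    """One tier in the cascade: keep the top keep_fraction of items by score."""
    ranked = sorted(items, key=score, reverse=True)
    keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:keep]

def cascade(raw_items, score, fractions=(0.25, 0.5, 0.5)):
    """Run items through successive tiers (e.g., 64 -> 16 -> 8 machines' worth)."""
    items = raw_items
    for fraction in fractions:
        items = tier_filter(items, fraction, score)
    return items  # only these reach the top tier for in-depth analysis

# Example: 6,400 sensor readings scored by how far they deviate from nominal.
readings = [{"id": i, "value": 20.0 + (i % 100) * 0.01} for i in range(6_400)]
top = cascade(readings, score=lambda r: abs(r["value"] - 20.0))
print(len(readings), "->", len(top))  # most of the junk never reaches the top
```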
Another possible solution is to use cloud computing schemes, he said. Cloud computing refers to the ability to construct ad hoc networks of computers that can share resources to tackle tough computing challenges.
An organization might have 10,000 computers at its disposal. Cloud computing provides a management structure by which, for example, 1,000 of those machines might be aggregated to solve a particular problem.
'It could take a year to build a network of 1,000 computers,' Hull said, 'but the cloud computing architecture allows this to be done quickly.'
Another potential solution comes in the form of multicore processing, essentially assigning pieces of the puzzle to different processors running simultaneously on a single device. There are limitations to this approach, as there are with cloud computing, because most applications are single-threaded, Sedlacek said. Multicore central processing units do not increase computing power when the applications can't be divvied up into discrete tasks.
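To make that distinction concrete, here is a brief Python sketch assuming a workload that can be split into independent chunks; the function names are illustrative. Only the multicore version keeps the extra cores busy, and a workload that cannot be chunked this way would see no benefit.

```python
# Sketch of single-threaded vs. divisible workloads, assuming the task can be
# split into independent chunks (names and workload are illustrative).
from multiprocessing import Pool, cpu_count

def process_chunk(chunk):
    """Stand-in for per-chunk work, e.g., filtering one slice of sensor data."""
    return sum(x * x for x in chunk)

def run_single_threaded(data):
    # One core does everything; the other cores on the chip sit idle.
    return process_chunk(data)

def run_multicore(data, workers=None):
    workers = workers or cpu_count()
    chunk_size = max(1, len(data) // workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(workers) as pool:                # one worker process per core
        return sum(pool.map(process_chunk, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    assert run_single_threaded(data) == run_multicore(data)
```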
The premise of multicore computing is that the computing capacity of microchips is leveling off and that the computing power inherent in existing machines must be maximized and optimized. Making that happen requires programmers to accomplish two things, said Joey Sevin, Navy programs manager at Mercury Federal Systems. They must develop a greater understanding of computer hardware, and they must change how they write software.
'It requires people to think differently about applications and how to write them,' Sevin said.
'Programmers are encouraged to throw off code quickly, but in the end this is very inefficient when the application is single-threaded.'
Sevin said the solution is to use middleware that can coordinate messaging among multiple processors. 'What needs to happen is the adoption of a standard' for a message passing interface, he said.
MPIs would allow existing computers to distribute tasks across their processors and boost their processing power. Distributing computing assignments across multiple processors also has the effect of making the software less complex, Sevin said.
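As a rough illustration of the message-passing model Sevin describes (not Mercury's middleware), the sketch below assumes the open-source mpi4py binding and an MPI runtime are available: a root process scatters chunks of work to its peers and gathers the partial results back.

```python
# Generic MPI-style scatter/gather sketch, assuming mpi4py and an MPI runtime.
# Run with, e.g.:  mpiexec -n 4 python scatter_gather.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()          # this process's ID
size = comm.Get_size()          # total number of cooperating processes

if rank == 0:
    data = list(range(1_000))
    # The root splits the workload into one chunk per process.
    chunks = [data[i::size] for i in range(size)]
else:
    chunks = None

chunk = comm.scatter(chunks, root=0)       # message passing: distribute the chunks
partial = sum(x * x for x in chunk)        # each process works on its own piece
totals = comm.gather(partial, root=0)      # message passing: collect partial results

if rank == 0:
    print("combined result:", sum(totals))
```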
Mercury is working on developing multiprocessor solutions for processing sensor data.
Because data collection platforms are getting smaller and more complex, Mercury wants to pool processing power to support multiple missions.
'The idea is to create an environment adaptive to different situations,' Sevin said. An unmanned aerial vehicle 'may go out on a mission. When it finishes its job and transmits its data, the computing asset may be reallocated to some other mission in another location and with a different type of sensor.'
This type of system is designed to handle two problems inherent in the collection and transmission of sensor data: latency and throughput.
Latency refers to the delay between when data is collected and when it can be acted on, a critical concern when computing must function in real time. Throughput problems arise when the volume of data overwhelms processors and causes delays.
What sort of solution would the Marine Corps be most interested in? Sedlacek leaned toward simpler and leaner software. She urged industry to adopt open, modular and scalable software designs and to avoid 'featuritis.' She also suggested that the Marine Corps might develop incentives for lean software design, and she urged software developers to adopt the YAGNI principle: You Ain't Gonna Need It, so don't code it.