In the desert Southwest, a DOD team helps developers make sure their systems are ready for fielding. Here's why.

Testing has long been considered the Achilles' heel of the military IT development process because it adds to the time and cost of getting new technology into users' hands.

'There are a lot of program managers who are under the gun to get their program on the street. I think there are more systems out there that we probably should be testing,' said Air Force Col. Victoria A. Velez, commander of the Defense Information Systems Agency's Joint Interoperability Test Command in Fort Huachuca, Ariz. JITC tests and certifies joint interoperability of command, control, communications, computers, intelligence, surveillance and reconnaissance systems.

'The issue comes down to risk,' Velez said. 'What the program managers and end customers are willing to risk for speed and the usability of the product or the program.'

With most critical business systems, performance testing is mandatory. But with Web-based applications designed largely to process business interactions, project managers can move straight from development to fielding.

Doing so is not necessarily a good idea, Velez said. Program managers who forgo testing often find themselves forced later to add costly capabilities to fix unanticipated problems, she said.

'There is a general frustration with the amount of testing with some of the larger systems, but testing is important to make sure you're going to have a minimum amount of hiccups,' said Ray Bjorklund, vice president of market intelligence for Federal Sources Inc. of McLean, Va.

Leo Hansen, project manager of JITC's Joint Network Enterprise Testbed Laboratory, said his organization has been working hard to fight the negative perception of testing that persists. JNET has a new philosophy: Be nicer and more helpful, and program managers will come. At the lab, tests can last anywhere from two to six weeks, he said.

'Originally the testing philosophy was: Bring something in to test it, dump it on the developer and, in the process, rap them out,' Hansen said. 'Because of that, a lot of people are reluctant to come.'

Now the goal is to make sure that a system passes a test before the lab finishes its work, Hansen said. 'We don't make money by knocking them down,' he said. 'We try to be as helpful as we can.'

Without performance testing, project managers often find failures months, and sometimes years, after an application has been developed and is being integrated with other apps. These failures can get expensive, forcing developers to fix software glitches that could, and should, have been found during testing, Hansen said.

'The Thrift Savings Plan is the perfect example,' he said. 'The problem with Web-based apps is it's fairly easy to develop apps, test them with one, two, three people without much load and then, when you go on the air, you find out things are grossly under capacity. Now you have to go back and retrofit it, and it costs a lot more to catch it later.'

Software problems delayed the rollout of TSP's new record-keeping system. Although the glitches have been fixed, a Web interface caused additional complaints once the system came online in mid-June.

DISA created the JNET lab in 1996 to support the Defense Department's paperless initiative. The lab is one of several dozen parts of JITC, which tests strategic, tactical and communications systems; JNET also tests how systems interoperate. When it was created, JNET's only mission was to support military efforts for electronic data interchange.
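Hansen's point about load is easy to illustrate. The sketch below is a minimal load probe in Python, not the lab's actual tooling; the URL is a hypothetical stand-in for an app under test. It replays the same request at growing concurrency and reports latency and errors, which is roughly how a Web app that passed with 'one, two, three people' reveals itself to be under capacity.

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = 'http://app-under-test.example/login'  # hypothetical endpoint

    def hit(_):
        # Time one request; return (latency, error) so failures count too.
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                resp.read()
            return time.perf_counter() - start, None
        except Exception as exc:
            return time.perf_counter() - start, exc

    def run(concurrency, total):
        # Fire `total` requests with `concurrency` simultaneous users.
        with ThreadPoolExecutor(max_workers=concurrency) as pool:
            results = list(pool.map(hit, range(total)))
        ok = sorted(t for t, err in results if err is None)
        errors = len(results) - len(ok)
        if ok:
            print(f'{concurrency:>3} users: median {ok[len(ok) // 2]:.3f}s, '
                  f'worst {ok[-1]:.3f}s, {errors} errors')
        else:
            print(f'{concurrency:>3} users: all {errors} requests failed')

    # A clean pass at one user says little; capacity problems surface
    # only when many users arrive at once.
    for users in (1, 3, 50):
        run(users, users * 10)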
Today, the lab has 30 employees with expertise on topics ranging from security and Section 508 compliance to smart-card technology and protocols. JITC's primary testing mission is to support the needs of warfighters, Velez said.

'We do development testing, independent validation and verification, operational testing, performance testing, just about any type of testing,' said Dan Bryce, lead contractor for JNET. A staff fielded by a joint venture between ManTech International Corp. of Fairfax, Va., and Electronic Warfare Associates Inc. of Herndon, Va., works with the government testers at the center.

The lab also offers training, integration and help desk support, such as system setup and configuration.

The first step in the testing process is for testers to talk to customers about their expectations, Bryce said.

'What do they want, what are they really looking to get out of it?' he said. 'We like to talk to some of their users, too.'

Some of the apps the lab is testing or recently tested include:
- Wide Area Work Flow, an e-commerce application intended to replace paper forms and invoices

- Transportation Global Edit Table, a database tool that the Defense Finance and Accounting Service uses to code accounting information

- Online Registration and Certification Application, an e-government app that establishes a database of information on businesses that work with the government, alleviating the need to re-enter the information with each transaction.

[Photo: Col. Victoria A. Velez says developers who eschew testing in favor of a quick rollout are tempting fate.]

[Photo: In the Joint Tactical Radio System Test Facility, senior analyst Kevin Fitzpatrick checks the standards conformance of the Enhanced Position Location Reporting System. Credit: Chris Richards]

[Photo: Col. Victoria A. Velez looks on as program analyst Carol McCracken tries out software that uses audio prompts to help blind users navigate the Web.]

[Photo: The center tests many products, such as this specialized keyboard, for compliance with Section 508 accessibility rules.]
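The Section 508 work noted in the captions above also lends itself to simple automation alongside hands-on checks like McCracken's. As an illustration only, and no claim about JITC's actual tools, this Python sketch scans sample markup for images missing the alt text that screen readers and audio prompts rely on:

    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        # Counts <img> tags and flags those with no alt attribute at all,
        # one of the simplest automatable Section 508 checks.
        def __init__(self):
            super().__init__()
            self.total = 0
            self.missing = []

        def handle_starttag(self, tag, attrs):
            if tag == 'img':
                self.total += 1
                attr_map = dict(attrs)
                if 'alt' not in attr_map:
                    self.missing.append(attr_map.get('src', '(no src)'))

    # Stand-in markup; a real run would feed the page under test.
    page = ('<html><body>'
            '<img src="seal.gif" alt="agency seal">'
            '<img src="chart.gif">'
            '</body></html>')

    checker = AltTextChecker()
    checker.feed(page)
    print(f'{len(checker.missing)} of {checker.total} images lack alt text:',
          checker.missing)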