Can software quality tools shrink technical debt?
By experimenting with new standards for software quality, building those standards into contracts and deploying code scanning tools that spot potential problems, IT and software managers can predict and reduce the future costs of enterprise systems.
The federal government spends more than $80 billion annually on IT, and according to some estimates, over 70 percent of that goes to maintaining legacy systems, leaving only a fraction for next-generation agency systems.
The formula keeps legacy government IT systems running, but it also exposes them to potential disaster as the costs of maintaining older systems inevitably mount and budgets for new investments are squeezed. The resulting zero-sum game tends to disadvantage new systems aimed at the latest priorities and requirements.
Speaking at a hearing before the House Committee on Oversight and Government Reform Subcommittee on IT in September, Mark Ryland, Amazon Web Services’ chief architect for worldwide public sector, warned lawmakers of the risks they were taking on in funding older systems.
The effort to balance the funding of older legacy systems against the urgencies of new priorities is part of a discussion IT policymakers are having over how to handle “technical debt.” The term can be defined as the future cost of going back and fixing problems that were created when software was first developed.
“If I build a piece of software and I’ve made some mistakes [but] don’t have time to fix them now… I know I have to fix them in the future,” said Bill Curtis, executive director of the Consortium for IT Software Quality (CISQ). “I’ve created a debt, and we’re going to have to pay that debt back sometime by spending time fixing those problems in the future.”
The failure to attend to technical debt has consequences ranging from slower end-user services to potential conflicts among developers.
“As it gets worse, customers complain about slow delivery, increasing the pressure to take more short cuts, which increases the technical debt, which slows the delivery process, which increases customer dissatisfaction, in a rapidly spiraling vicious cycle,” said IT analyst Jim Highsmith in a recent report by GSA’s 18F digital services organization.
In addressing technical debt, IT and software managers lately have begun to experiment with new tactics, such as introducing new standards for software quality, using those standards in contracts and service agreements, and improving code scanning tools to spot where quality best practices are not being followed.
“Today software standards makers want to use measures they’ve created for software structural quality -- such as performance efficiency, maintainability and reliability and security -- as a foundation for estimating the amount of technical debt in a system,” said Curtis. “If we can analyze the code, look for violations of good architectural and coding practices in the code and [determine] the amount of effort required to fix those, then we can use that as an estimate of the amount of technical debt in the system.”
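The estimation approach Curtis describes boils down to simple arithmetic: multiply the violations a scanner finds by an assumed effort to fix each one, then price the total at a labor rate. The minimal sketch below illustrates that roll-up; the violation categories, hours and rate are hypothetical placeholders, not figures from CISQ or CAST.

```python
# Illustrative technical-debt estimate: violations reported by a code scanner,
# each weighted by an assumed average remediation effort. All figures below
# are hypothetical stand-ins for what a real scan and cost model would supply.

# Violations from a (hypothetical) scan, grouped by quality characteristic.
violations = {
    "reliability": 42,             # e.g., unhandled exceptions, missing checks
    "security": 7,                 # e.g., unsanitized inputs
    "performance_efficiency": 15,  # e.g., inefficient loops or queries
    "maintainability": 120,        # e.g., overly complex or duplicated code
}

# Assumed average hours to fix one violation of each type.
hours_per_fix = {
    "reliability": 2.5,
    "security": 6.0,
    "performance_efficiency": 3.0,
    "maintainability": 1.0,
}

HOURLY_RATE = 100  # assumed blended labor cost in dollars per hour

def estimate_technical_debt(violations, hours_per_fix, rate):
    """Return (total hours, total dollars) to remediate the reported violations."""
    hours = sum(count * hours_per_fix[kind] for kind, count in violations.items())
    return hours, hours * rate

hours, dollars = estimate_technical_debt(violations, hours_per_fix, HOURLY_RATE)
print(f"Estimated remediation effort: {hours:.0f} hours (~${dollars:,.0f})")
```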
Curtis is also a senior vice president and chief scientist at CAST, which develops software that scans code for defects, including violations of language and architectural best practices and security vulnerabilities.
The CAST tool is also designed for analyzing and reporting measures of code under development, including broader levels of complexity in programming projects.
Government IT consultancy Booz Allen Hamilton, which on any given day has 300 different software development projects under way for mostly government clients, uses CAST as a tool for measuring systemic coding practices across its coding teams.
“We really procured it as a process improvement tool,” said Dan Tucker, a principal at Booz Allen Hamilton who is a leader in the firm’s systems delivery group. “CAST ... gave us that architectural view – so this is how complex your system is, how complex it’s been architected, where there might be risk in how the system is coded.”
CAST presents levels of analysis that many open source tools don’t, Tucker said, including the ability to identify particularly complex sections of code, showing which modules have the most logic running through them and which have generated the most comments from coders.
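For a feel for what identifying complex sections of code involves, the short sketch below approximates one such measure using Python’s standard ast module, counting branch points per function as a rough stand-in for cyclomatic complexity. It illustrates the general technique only and is not a description of how CAST performs its analysis.

```python
# Rough complexity scan: count decision points per function as a simple proxy
# for cyclomatic complexity. A simplified illustration, not CAST's analysis.
import ast
import sys

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.With, ast.BoolOp)

def branch_count(func_node):
    """Count decision points inside a function definition."""
    return sum(isinstance(node, BRANCH_NODES) for node in ast.walk(func_node))

def flag_complex_functions(source, threshold=10):
    """Yield (function name, branch count) for functions at or above the threshold."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            score = branch_count(node)
            if score >= threshold:
                yield node.name, score

if __name__ == "__main__":
    # Usage: python complexity_scan.py path/to/module.py
    with open(sys.argv[1]) as f:
        source = f.read()
    for name, score in sorted(flag_complex_functions(source), key=lambda x: -x[1]):
        print(f"{name}: {score} branch points")
```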
While the firm has used the tool to analyze its internal projects, it has also received queries from government agencies, including the Air Force and Navy, which were interested in using CAST to help gauge the impact of the addition of a significant number of new users on several legacy systems.
The CAST report found that the Navy system was well developed in the security arena, but its architectural design revealed “brittleness that could result in some latency if it was scaled up.”
The CAST tool’s capability for analyzing code at the system level may also make it useful for gauging service-level-agreement-type requirements, according to Tucker, who recalled a request for proposals that required bidders to meet CAST-based service levels.
“We saw one RFP that said, ‘your first order of business will be to make sure it stays at least with this level,’” he said. “It was almost like a service-level agreement based on their structural code quality score coming out of CAST.”
Curtis said CISQ aims to submit a standard for technical debt at an upcoming meeting of the Object Management Group standards consortium. The standard would encompass the structural quality of code, or “how well the system is built,” he said. “We’re only concerned with the violations of good quality -- that you know you have to fix because it creates debt -- and you’re going to have to pay for in the future.”
“Technical debt is a huge cost,” Curtis said. “If government really wants to improve its efficiency and improve costs without sacrificing service, it’s going to be determined by the quality of these systems. And removing technical debt will help make that cheaper and more efficient.”
Editor's note: This article was changed Dec. 29 to clarify Dan Tucker's role at Booz Allen Hamilton.