FISMA priorities need to be realigned, House committee told
The government's key information assurance law misdirects scarce resources and does little to assure the security of agencies' legacy IT systems, an industry expert told a congressional committee today.
The Federal Information Security Management Act 'runs the risk of becoming a paperwork exercise,' Kenneth Ammon, president of NetSec Inc. of Herndon, Va., told the House Government Reform Committee. 'If you look at the reporting that is being done under FISMA, there are virtually no objective measures of the agencies' real-world security posture.'
Ammon testified during a hearing on Internet security.
Another speaker also criticized the Internet's underlying protocols and suggested that .gov Web sites should not be hosted on government servers.
FISMA requires agencies to include security planning in budget proposals for new systems and programs, and to perform security audits, including certification and accreditation of existing IT systems. The Office of Management and Budget oversees the requirements. But security is being equated with timely completion of the C&A process, Ammon said.
'Unfortunately, C&A provides little value when applied to existing systems,' he said. 'Agencies are slavishly spending scarce resources to produce reports that merely state the obvious in page after page of gory detail. We need to stop wasting money on C&A reports, shortcut the paperwork process and spend our money effectively for pragmatic risk reduction.'
Ammon also showed examples of sensitive government personnel information accessed through the Google Internet search engine. That access could have been blocked with simple configuration changes in the applications supporting that information, but identifying the needed changes would require detailed application testing.
'There seems to be little awareness of or interest in this kind of testing in the federal government,' Ammon said.
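As a rough illustration of the kind of testing Ammon described, the short Python sketch below probes a site for paths that commonly leave sensitive files publicly reachable, and therefore indexable by search engines such as Google. The host name and path list are hypothetical examples, not drawn from the hearing testimony.

# A minimal sketch (hypothetical host and paths) of automated application
# testing: check whether resources that should be private are publicly
# reachable over plain HTTP requests.
import urllib.error
import urllib.request

HOST = "https://www.example.gov"  # hypothetical site under test
PATHS = ["/personnel.xls", "/backup/", "/admin/", "/reports/directory.csv"]

for path in PATHS:
    url = HOST + path
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            # A 200 response means the resource is publicly reachable.
            print(f"EXPOSED   {url}  (HTTP {resp.status})")
    except urllib.error.HTTPError as err:
        print(f"blocked   {url}  (HTTP {err.code})")
    except urllib.error.URLError as err:
        print(f"no answer  {url}  ({err.reason})")

In practice such a scan would be run against an inventory of an agency's own applications; the point is that the check itself is inexpensive compared with the certification paperwork Ammon criticized.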
One way to help prevent such access would be to change the way government Web sites are hosted, said F. Thomas Leighton, chief scientist for Akamai Technologies Inc. of Cambridge, Mass.
'It could make sense to remove public-facing Web sites from government networks altogether,' Leighton said.
Akamai provides distributed content delivery services to a number of government agencies through a worldwide network of Web servers.
Karen Evans, OMB's administrator for e-government and IT, said outsourced Web hosting could be an option for agencies, depending on what online services are provided through the sites.
Leighton also said hardening government infrastructure would not provide complete IT security. 'You need to fix the Internet, securing the basic underlying protocols' such as the Border Gateway Protocol and the Domain Name System, he said. Requests to change traffic routing under these protocols should be authenticated before being implemented, Leighton said; today there is no such authentication, which makes it possible for hackers to hijack traffic.
'Government has a role to play' in making such improvements, Leighton said, primarily through funding security research and development.
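To illustrate the gap Leighton described, the sketch below performs an ordinary DNS lookup; it assumes the third-party dnspython package, which was not mentioned at the hearing. The only integrity check on the reply is a 16-bit transaction ID, with no cryptographic authentication of who sent the answer.

# A minimal sketch, assuming the third-party dnspython package: an ordinary
# DNS lookup over UDP. The resolver accepts any reply whose 16-bit ID and
# question match the query; nothing cryptographically authenticates the
# answer itself, which is the weakness that lets attackers redirect traffic.
import dns.message
import dns.query

query = dns.message.make_query("www.example.gov", "A")  # hypothetical name
response = dns.query.udp(query, "8.8.8.8", timeout=5)   # any recursive resolver

print("Transaction ID matched:", response.id == query.id)
for rrset in response.answer:
    print(rrset)

Authenticating such lookups and routing changes cryptographically is the kind of protocol-level fix Leighton argued the government should help fund.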