The commotion over allegations that the National Security Agency has been secretly compiling data on millions of telephone calls made by ordinary citizens raises an interesting question: With the technologies in place today, how well can NSA actually mine the information it gathers?
There is no public information on the computers and software NSA is using, and the spy agency isn't about to discuss them. But there are companies whose products are used to handle very large databases, and the challenges NSA would face in cross-referencing the information and looking for connections are well known.
Calling on AT&T
According to published reports, AT&T Corp. provided access to "Daytona," a database management technology that it uses to manage its call detail record (CDR) database. "As of September 2005, all of the CDR data managed by Daytona, when uncompressed, totaled more than 312 terabytes," according to the Electronic Frontier Foundation, which filed a class action lawsuit against AT&T in January.
If this figure is accurate, NSA's database could be about 900TB, assuming Verizon Communications Inc. of New York and BellSouth Corp. of Atlanta, the other companies alleged to have provided information to NSA, have CDR databases of similar size.
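The arithmetic behind that estimate is straightforward. The sketch below spells it out in Python; the assumption that the other two carriers' databases are roughly the same size as AT&T's is the article's working guess, not a published figure.

```python
# Back-of-the-envelope estimate of the combined CDR database size,
# assuming (as the article does) that Verizon's and BellSouth's
# databases are roughly the size of AT&T's reported 312TB.
att_tb = 312                      # AT&T's Daytona CDR data, uncompressed, per EFF
carriers = 3                      # AT&T, Verizon, BellSouth
combined_tb = att_tb * carriers
print(f"Estimated combined CDR data: ~{combined_tb} TB")  # ~936TB, i.e. roughly 900TB
```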
The technology needed for data mining can be broken roughly into three components: storage, computing power and analytical software.
Data mining (looking for patterns in all that information that could reveal terrorist "sleepers" within the United States, for instance) requires a computer to have real-time access to as much of the entire database as possible.
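To make the idea concrete, the sketch below shows one elementary form of such pattern analysis: starting from a single number and walking a call-record graph outward a few hops. It is purely illustrative; the record layout and phone numbers are invented, and it says nothing about the tools NSA actually uses.

```python
# Illustrative link analysis over hypothetical call detail records:
# find every number within two hops (calls) of a starting number.
from collections import defaultdict

records = [                       # (caller, callee) pairs; invented data
    ("202-555-0101", "703-555-0199"),
    ("703-555-0199", "410-555-0123"),
    ("410-555-0123", "301-555-0144"),
]

graph = defaultdict(set)          # undirected "who called whom" graph
for caller, callee in records:
    graph[caller].add(callee)
    graph[callee].add(caller)

def within_hops(start, max_hops):
    """Breadth-first search out to max_hops from a starting number."""
    seen, frontier = {start}, {start}
    for _ in range(max_hops):
        frontier = {n for f in frontier for n in graph[f]} - seen
        seen |= frontier
    return seen - {start}

print(within_hops("202-555-0101", 2))  # numbers one or two calls away
```

Doing this at useful speed over hundreds of terabytes, rather than over three toy records, is where the engineering challenges described below come in.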
"Storage is a combination of online accessible storage and offline multimedia storage," said Robert David Steele, chief executive officer of OSS.net and a former CIA employee, who now champions the use of open-source information for intelligence purposes.
"My impression, strictly a professional guess, is that at least 75 percent of what NSA 'knows' is ... offline and not accessible. ... You cannot do good pattern analysis, including historical comparisons, without massive online storage."
SGI has been pioneering ways to increase the amount of RAM available to its computers. While desktop computers have, on average, 512MB to 1GB of RAM, SGI has configured systems with terabytes' worth of active memory.
"You want to put a tremendous amount of data in the RAM memory of the computer at any one time, so you can cross-reference very quickly," said Bill Mannel, director of systems marketing for SGI. "If the [database] is stored on disk, the disk access itself is too long."
To date, though, the largest RAM-configured computer SGI has shipped is 13TB, Mannel said. That could handle less than 1.5 percent of the three combined CDR databases.
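That fraction follows directly from the figures already quoted:

```python
# Share of the estimated combined CDR data (about 936TB, from the
# 312TB-per-carrier assumption above) that fits in a 13TB RAM system.
ram_tb = 13
combined_tb = 312 * 3
print(f"{ram_tb / combined_tb:.1%}")  # about 1.4 percent
```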
In addition, there are limits on how much RAM a computer can have. 'The practical upper limit for memory space is tied to the number of address bits on chips,' Mannel said. Another limitation is the physical space available within the CPU, he said.
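The address-bit limit is a matter of counting: an n-bit address can distinguish at most 2^n byte locations. The widths below are chosen only to illustrate the scaling, not to describe any particular SGI machine.

```python
# How address width caps byte-addressable memory: 2**bits bytes.
for bits in (40, 44, 48):
    print(f"{bits}-bit addressing: up to {2**bits // 2**40} TB")
# 40 bits -> 1 TB, 44 bits -> 16 TB, 48 bits -> 256 TB
```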
"Some of our customers who already have big-memory databases are looking for something beyond [what they have], but they have power and footprint problems," Mannel said.
Getting RAM large enough to bring in the entire database (and have room for the analytical software to crunch the information) also means a complete revamping of storage architecture, he added.
James Gray, manager of Microsoft Corp.'s eScience group, downplayed any technology limitations. "It's not a difficult problem computationally; it's not even unbelievably expensive, but it's not cheap," Gray said. "I'm not going to do this on my laptop, clearly. I'm going to use thousands of computers [and] break this up" into pieces.
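One common way to break a job like this into pieces, in the spirit of Gray's remark though not necessarily his method, is to hash-partition the records by phone number so that each machine stores and scans only its own slice. The node count and record layout below are illustrative assumptions.

```python
# Hash-partitioning call records across a (hypothetical) pool of nodes.
from hashlib import sha1

NUM_NODES = 2000                  # illustrative cluster size

def node_for(phone_number: str) -> int:
    """Map a phone number to one of NUM_NODES partitions."""
    return int(sha1(phone_number.encode()).hexdigest(), 16) % NUM_NODES

record = ("202-555-0101", "703-555-0199", "2006-05-11T09:14:00")  # invented
print(f"Record for {record[0]} is stored on node {node_for(record[0])}")
```

Partitioning by caller keeps all of one number's calls on a single machine, which makes per-number lookups cheap; queries that span partitions then need a separate aggregation step.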
As for the analytical software, in November 2004 Eric Hazeltine, NSA's research director, told the Technology Council of Anne Arundel County, Md. (where the agency is based), that a company had been awarded a one-year, $445,000 contract to help find or develop new software capable of handling the huge amounts of data the agency collects.
Hazeltine told the audience that NSA invited a representative from a large Silicon Valley-based relational database-mining company to discuss ways of ingesting the data. But the agency found the company had little to offer.
"We told him our problems and he said, 'That's way beyond anything we can do,'" Hazeltine said then.