House seeks clarity on FBI facial recognition database
Members of the House Committee on Oversight and Government Reform questioned the FBI over its use of facial recognition technology.
The FBI has expanded its access to photo databases and facial recognition technology to support its investigations. Lawmakers, however, have voiced deep mistrust in the bureau's ability to protect the photos of millions of American citizens and to follow the transparency rules that govern their use.
Kimberly Del Greco, the deputy assistant director of the FBI's Criminal Justice Information Services Division, faced tough questioning from both sides of the aisle at a March 22 hearing of the House Committee on Oversight and Government Reform.
The FBI’s use of facial recognition technology was called into question last year after the Government Accountability Office issued a report saying the bureau had not updated its privacy impact assessment when the Next Generation Identification-Interstate Photo System “underwent significant changes.”
The FBI’s NGI-IPS allows law enforcement agencies to search a database of over 30 million photos to support criminal investigations. The bureau’s internal Facial Analysis, Comparison and Evaluation (FACE) unit can also tap other federal photo repositories and databases in 16 states, which can include driver's license photos. Through these databases, the FBI has access to more than 411 million photos of Americans, many of whom have never been convicted of a crime.
The privacy impact assessment and other documents help the public understand how the system protects privacy, Diana Maurer, GAO's director of Homeland Security and Justice Issues, testified at the hearing.
The same GAO report said the FBI was not regularly testing the accuracy of its system and had not done the testing needed to ensure that the system provides accurate results for “all allowable candidate list sizes.”
Multiple witnesses, including Jennifer Lynch, a senior staff attorney at the Electronic Frontier Foundation, and Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law, said that facial recognition technologies produce false positives more often for women, younger individuals and people of color.
“That is due to the training data that is used in facial recognition systems,” Lynch said. “Most facial recognition systems are developed using pretty homogeneous images of people’s faces, so that means mostly whites and men.”
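Lynch's point can be illustrated with a simple audit: count how often a matcher wrongly declares two different people a match, broken out by demographic group, and compare the rates. The sketch below is purely illustrative; the group labels, similarity scores and decision threshold are hypothetical and are not drawn from the FBI's system or any real evaluation.

```python
# Illustrative sketch: comparing false match rates across demographic groups.
# All data here is hypothetical; a real audit would run the matcher over a
# large labeled test set and tabulate results the same way.
from collections import defaultdict

# Each record: (group label, similarity score, whether the two photos show the same person)
results = [
    ("group_a", 0.91, False), ("group_a", 0.42, False), ("group_a", 0.88, True),
    ("group_b", 0.91, False), ("group_b", 0.93, False), ("group_b", 0.95, True),
]

THRESHOLD = 0.85  # hypothetical score above which the system declares a "match"

false_matches = defaultdict(int)   # impostor pairs wrongly declared a match
impostor_pairs = defaultdict(int)  # all pairs of different people, per group

for group, score, same_person in results:
    if not same_person:            # impostor comparison: two different people
        impostor_pairs[group] += 1
        if score >= THRESHOLD:     # system wrongly flags them as the same person
            false_matches[group] += 1

for group, total in impostor_pairs.items():
    print(f"{group}: false match rate = {false_matches[group] / total:.2f}")
```

If the rate is markedly higher for one group than another, as witnesses said it is for women, younger people and people of color, investigative leads generated by the system will fall disproportionately on innocent members of that group.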
Rep. Stephen Lynch (D-Mass.), however, argued that the problems were more fundamental than potential biases.
“Innocent people should not be on this database,” Lynch said. “This is really Nazi Germany here. ... They had meticulous files on people of the Jewish faith and that’s how they tracked their people. And I see little difference in the way people are being tracked using one wide net and collecting information on all American citizens.”
Bedoya said Georgetown’s Center on Privacy and Technology recommends having citizens vote to approve their state’s use of driver's license photos in such databases, especially since “most people have no idea this is happening.”
Del Greco said, both in her testimony and in response to multiple questions, that privacy is of the utmost importance to the FBI and that the facial recognition results are used only as investigative leads.
Committee Chairman Rep. Jason Chaffetz (R-Utah) pushed back on that argument. He noted that other biometric data, like fingerprints and DNA, are also used for investigative leads, but said collection of those is much narrower.
“DNA is a valuable investigative tool,” he said. “Fingerprints are a valuable investigative lead ... what scares me is the FBI and the Department of Justice proactively trying to collect everyone’s face.”
The FBI's failure to update the privacy impact assessment, Chaffetz added, was yet another reason not to trust the agency with ordinary Americans' personal information.
Del Greco said the assessment was submitted to the DOJ. But Maurer said it was submitted only after the technology had been used in real-world applications for years.
“So here’s the problem,” Chaffetz said. “You’re required by law to put out a privacy statement and you didn’t and now we’re supposed to trust you with hundreds of millions of people’s faces.”
Del Greco said a privacy attorney at the FBI was advised of the changes in the system along the way, but Chaffetz was unconvinced: “Yeah, well, we don’t believe you.”