Lawmakers Take a Hard Look at Facial Recognition Technology
One expert estimates a quarter of American law enforcement agencies have facial recognition capabilities, but the technology remains largely unregulated.
Local and federal lawmakers are starting to cast a skeptical eye on the use of facial recognition technology by law enforcement, with the San Francisco Board of Supervisors recently voting to ban the practice and a prominent committee in the U.S. House of Representatives launching a series of hearings on the issue.
Privacy and civil liberties advocates have raised concerns about the accuracy of facial recognition software, particularly when used to match photos of minorities, and questioned the legality of using the technology in conjunction with real-time surveillance networks.
With more than 50 million surveillance cameras operational in the United States, incorporating facial recognition technology into the current network of government-owned cameras would “create a near-constant surveillance state,” Neema Singh Guliani, senior legislative counsel for the American Civil Liberties Union (ACLU), told the U.S. House Oversight and Reform Committee this week.
The committee plans to hold another hearing on the technology on June 4 to question representatives of law enforcement agencies. Chairman Elijah Cummings, a Democrat from Maryland, said the committee then plans to “conduct deeper dives on specific issues related to federal law enforcement, state and local issues, and the private sector.”
At the hearing, both Republicans and Democrats voiced concerns about the technology and said they saw a need for government to step in.
"No elected officials gave the OK for the states or for the federal government, the FBI, to use [facial recognition],” said Rep. Jim Jordan from Ohio, the ranking Republican on the panel. “There should probably be some kind of restrictions. It seems to me it's time for a timeout."
No federal regulations restrict law enforcement use of facial recognition technology, but some local governments have begun to regulate its use.
In San Francisco, lawmakers voted this month to ban the use of facial recognition technology by police and other government agencies. The move was preemptive, as the technology was not in use by city departments.
“There are implicit biases in the algorithms deployed in this technology, which [artificial intelligence] experts have now determined misidentify people of color and women at a disproportionately high rate,” said Supervisor Aaron Peskin ahead of the board’s vote. “Even if the technology is ultimately perfected, facial recognition technology is uniquely dangerous and oppressive. Unlike other technologies, we cannot hide our faces or change what we look like.”
Were the federal government to adopt legislation regulating facial recognition technology, it could potentially restrict its use or require certain safeguards at the local level, said Clare Garvie, a senior associate at Georgetown Law’s Center on Privacy and Technology. Most state and local law enforcement agencies that already have the technology paid for it with federal grant money, so “Congress has incredible power to decide how much transparency goes into implementing these technologies and what rules are in place,” Garvie told the House Oversight and Reform Committee.
Garvie estimates that at least a quarter of the 18,000 federal, state, county and local law enforcement agencies in the United States use facial recognition as either an investigative or surveillance tool.
Some local law enforcement agencies have used facial recognition technology for years in order to match images with mug shots or other photos already in agency databases.
For instance, police in Annapolis, Maryland, used a facial recognition database the state has operated since 2011 to identify the alleged gunman who killed five Capital Gazette newspaper employees in 2018, after the suspect was taken into custody.
According to the Baltimore Sun, more than 10 million images from the Maryland Motor Vehicle Administration, police mug shots and other photographs of arrestees are included in the system.
The ACLU’s Guliani said the FBI has access to driver’s license photo databases in at least 18 states—permissions that were often granted without state lawmakers’ signoff or notification to the public.
“This was all in secret, essentially,” she testified at the oversight committee hearing. “The people who wanted a driver’s license many times didn’t know these systems were operating either.”
A recent report by the Georgetown Law Center on Privacy and Technology also raised concerns about police use of facial recognition programs that rely on real-time video surveillance in Detroit and Chicago—particularly departments’ lack of transparency about the systems.
A Chicago police spokesperson told WTTW News that the technology is “seldom used” because it isn’t accurate. In Detroit, a city spokesman told the Detroit Metro Times that police only tap into the technology after a crime has occurred and do not use it in real time.
Officials with police departments say the technology has too many potential benefits to ignore.
Sgt. Eduardo Bernal, a spokesman for the Orlando Police Department in Florida, rattled off hypothetical examples in which the agency could deploy the technology, from locating a student threatening a school shooting to finding a kidnapped child.
The agency is currently in the second phase of testing Amazon’s Rekognition facial recognition software as part of a pilot program. Amazon shareholders voted last week to continue to sell the technology to government entities and law enforcement despite concerns raised by some investors.
Both Orlando police and the Washington County Sheriff’s Office in Oregon have publicly acknowledged using or testing the Rekognition software.
Amazon would not confirm whether any other law enforcement agencies are using the software. The company has issued guidelines for public safety use of the technology, and a spokesperson said Amazon supports "creation of a national legislative framework covering facial recognition." The spokesperson said the company has not received any reports of misuse of the Rekognition program by law enforcement agencies, but that it would investigate any such concerns.
Bernal said the program is “still in its infancy” and nowhere near full deployment. The software would have to be compatible with the approximately 200 cameras operated by the department in order to be considered a viable option, Bernal said.
As part of the pilot program, the photos of seven officer volunteers were uploaded into the system for detection through five cameras in department headquarters and three surveillance cameras in downtown Orlando, according to the Orlando Sentinel.
The current pilot phase will end in June, at which point the department will have to decide whether to continue with the next phase of testing, Bernal said.
Bernal acknowledged civil liberty advocates’ concerns that facial recognition technology disproportionately misidentifies people of color, but said the pilot program was not at the point of testing the accuracy of matches. One of the officers who volunteered for the test group was a person of color but is now retired from the agency, he said.
“We are not at that point yet,” Bernal said. “If the program moves forward obviously these are things that are vetted appropriately.”
Andrea Noble is a Staff Correspondent for Route Fifty.