Is banning surveillance tech worth it?
San Francisco’s ban on facial-recognition technology puts the city at risk not only of falling behind on technological innovation but also of returning digital workflows to manual processes, experts say.
The San Francisco Board of Supervisors’ vote to ban the use of facial-recognition technology puts the city at risk not only of falling behind on technological innovation but also of returning to what one expert described as archaic processes.
The board’s 8-to-1 vote on the Stop Secret Surveillance ordinance on May 14 affects more than facial-recognition tech, however. It defines surveillance technology as “any software, electronic device, system using an electronic device or a device that is used, designed or primarily intended to collect, retain, process or share” a variety of datasets, including thermal, olfactory, visual and biometric data. That definition extends “surveillance technology” well beyond cameras to include cell-site simulators, automatic license plate readers, gunshot detection hardware and services, closed-circuit TV cameras and wearable body cameras.
“For me, this is a bit of an overreach,” said Alison Brooks, research director for smart cities strategies and public safety at IDC. “What San Francisco is doing is, in fact, not allowing for the digital workflow to occur. I think that they want people to go back to, for example, sorting through mug shots archaically, manually, and that doesn’t make any sense from a cost-savings perspective or a work time perspective. It’s just going to cost their police force that much more money to connect all those dots.”
Some places are having success with facial-recognition technology. The Indiana Intelligence Fusion Center, for instance, has used it to identify criminals in cases ranging from petty theft to homicide. And police in Maryland used the technology to identify Jarrod Ramos, the gunman who killed five newspaper employees in Annapolis last year, after he refused to state his name.
“It’s not real clear that there’s a good reason for this ban,” said Daniel Castro, vice president of the Information Technology and Innovation Foundation. “There are a whole spectrum of uses for facial-recognition technology, from very simple organizing photos to trying to identify a suspect or a witness or anyone involved in a crime…. There’s lots of benign uses, uses that are completely in line with the manual activity that police do during the day,” such as manually looking through photos or asking the public to help identify someone.
The San Francisco board’s vote is partly a response to studies showing that the technology can be inaccurate and racially biased. For example, a recent test of Amazon’s Rekognition software, which the company markets to law enforcement, found that it was more accurate in assessing lighter-skinned faces.
“While surveillance technology may threaten the privacy of all of us, surveillance efforts have historically been used to intimidate and oppress certain communities and groups more than others, including those that are defined by a common race, ethnicity, religion, national origin, income level, sexual orientation, or political perspective,” according to the ordinance. “The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.”
Still, the technology is maturing amid calls for greater accuracy and transparency. The National Institute of Standards and Technology reported in its “Ongoing Face Recognition Vendor Test” last December that the software has improved significantly since NIST’s 2010 and 2014 studies.
The vote also reflects pressure from the American Civil Liberties Union’s West Coast offices, which Brooks said tend to be more protectionist than those elsewhere. Matt Cagle, a lawyer with the ACLU of Northern California, told the New York Times last week that the technology “provides government with unprecedented power to track people going about their daily lives. That’s incompatible with a healthy democracy.”
Part of the problem, Castro said, is that the shortcomings of the technology have been conflated with the shortcomings of its uses. For instance, the uproar over the FBI’s surveillance of protesters in 2015 after the death of Freddie Gray was more about improper or unwarranted surveillance of political protest than about facial-recognition technology itself.
“Because there is already such organized opposition to some of this police behavior, the objections are getting tacked on to facial recognition because I think that’s seen as an opportunity to push back on policing in general,” Castro said. “Some of that pushback is completely legitimate, but it’s conflating the technology with the activities, which I think can be separated out.”
A happy medium exists, but more testing and policies are needed to find it. Castro pointed to a pilot test of real-time surveillance in Orlando in which police officers tracked themselves, a good example, he said, of how a police department can begin working with cutting-edge technology before applying it to citizens.
Indeed, the San Francisco board included a number of exceptions to the ban. It doesn’t apply to federally controlled facilities at San Francisco International Airport and the city’s port, for instance, nor does it restrict business or personal use of the technology. The ordinance also calls for policies addressing transparency, oversight and accountability, as well as rules governing the procurement of surveillance technologies.
Similar bans are under consideration in Oakland, Calif., and Somerville, Mass., and the House Oversight and Reform Committee held a May 22 hearing on facial-recognition software, where committee Chairman Elijah Cummings (D-Md.) said lawmakers largely agree that the technology should be regulated.
“I think this is an area where it would be helpful for Congress to be a little more proactive,” Castro said.