Facial Recognition Software Incorrectly Flags 26 State Lawmakers as Criminals, ACLU Says
One California lawmaker said a recent test of Amazon's technology is proof that it should be kept from body-worn police cameras. The company says the testing method used by the ACLU wasn't fair.
Twenty-six lawmakers in California were incorrectly matched with mugshots in a recent test of Amazon’s facial recognition software conducted by the American Civil Liberties Union of Northern California.
The ACLU ran photos of all 120 members of the California State Legislature through Rekognition, Amazon’s facial recognition software, which matched roughly 20 percent of them to mugshots in a separate database. Assembly member Phil Ting, one of the lawmakers falsely identified as a criminal, said at a press conference Tuesday that the experiment illustrates the limitations of the technology, which particularly struggles to correctly identify people of color—especially women.
“We wanted to run this as a demonstration about how this software is absolutely not ready for prime time,” he said.
Ting, a Democrat from San Francisco, is the primary sponsor of a bill that would ban law enforcement agencies in California from using facial recognition technology in body-worn cameras.
Deploying the software there, Ting argued, would turn the cameras from a tool designed to increase trust and transparency into one of constant surveillance, with potentially devastating consequences for people arrested after being misidentified.
“It’s no laughing matter if you are an individual who is trying to get a job, if you are an individual trying to get a home,” he said. “There are real people who could have real impacts.”
The federal government so far has not regulated facial-recognition technology, leaving states and cities to construct their own restrictions. A few cities, including San Francisco, have prohibited all departments from using the software. The state of Oregon has banned police from deploying the technology in body-worn cameras. Ting’s bill goes further, banning the use of all "biometric surveillance technology."
The California test comes a year after the ACLU conducted a similar experiment using photos of members of Congress. In that test, 28 lawmakers matched up with mugshots of other people.
Amazon criticized the ACLU's method of testing its technology. In both cases, the company said, the fault for the false matches lies not with the software but with the settings used.
Matt Cagle, an attorney for the ACLU, said the organization used the software’s default setting, which reports a match at 80 percent confidence. For law enforcement use, Amazon recommends a threshold of at least 99 percent confidence, which requires far greater certainty before reporting a match and therefore returns fewer results.
“The ACLU is once again knowingly misusing and misrepresenting Amazon Rekognition to make headlines,” a spokeswoman said. “As we’ve said many times in the past, when used with the recommended 99% confidence threshold and as one part of a human driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking.”
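To make the threshold dispute concrete, here is a minimal sketch of how a confidence threshold is passed to Rekognition through the boto3 SDK. This is not the ACLU's actual test harness; the collection name, image file, and region are hypothetical placeholders.

```python
# Minimal sketch: querying an Amazon Rekognition face collection with an
# explicit confidence threshold via boto3. The "mugshots" collection and
# image file are hypothetical placeholders, not the ACLU's actual setup.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

def search_matches(image_path: str, threshold: float):
    """Return candidate face matches at or above the given confidence threshold."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()

    response = rekognition.search_faces_by_image(
        CollectionId="mugshots",        # hypothetical, pre-populated face collection
        Image={"Bytes": image_bytes},
        MaxFaces=5,
        # 80 is the service default; Amazon recommends 99 for law enforcement use.
        FaceMatchThreshold=threshold,
    )
    return response["FaceMatches"]

# A lower threshold returns more, and more error-prone, candidate matches
# than the 99 percent threshold Amazon recommends.
loose_matches = search_matches("lawmaker.jpg", threshold=80.0)
strict_matches = search_matches("lawmaker.jpg", threshold=99.0)
print(len(loose_matches), len(strict_matches))
```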
Ting’s bill passed the Assembly in May and is currently awaiting a vote in the Senate.
Kate Elizabeth Queram is a Staff Correspondent for Route Fifty and is based in Washington, D.C.