Facial recognition in policing is getting state-by-state guardrails

Instances of false arrests and privacy concerns are drawing lawmakers’ attention.

This article was originally published by Stateline.

In January 2020, Farmington Hills, Michigan, resident Robert Williams spent 30 hours in police custody after an algorithm listed him as a potential match for a suspect in a robbery committed a year and a half earlier.

The city’s police department had sent images from the security footage at the Detroit watch store to Michigan State Police to run through its facial recognition technology. An expired driver’s license photo of Williams in the state police database was a possible match, the technology said.

But Williams wasn’t anywhere near the store on the day of the robbery.

Williams’ case, the subject of a now-settled lawsuit filed in 2021 by the American Civil Liberties Union and the University of Michigan Law School’s Civil Rights Litigation Initiative, was the first publicly known case of wrongful arrest due to misuse of facial recognition technology (FRT) in policing.

But the case does not stand alone. Several more documented cases of false arrests due to FRT have come out of Detroit in the years following Williams’ arrest, and across the country, at least seven people have been falsely arrested after police found a potential match in the depths of FRT databases.

Williams’ lawsuit was the catalyst for changing the way the Detroit Police Department may use the technology, and other wrongful-arrest cases are being cited in proposed legislation surrounding it. Though it can be hard to legislate a technology that gains popularity quickly, privacy advocates say unfettered use is a danger to everyone.

“When police rely on it, rely on them, people’s lives can be turned upside down,” said Nate Wessler, one of the deputy directors of the Speech, Privacy and Technology Project at the national ACLU.

How Are Police Using FRT?

Facial recognition technology has become pervasive in Americans’ lives, and can be used for small personal tasks such as unlocking a phone, or in larger endeavors, like moving thousands of people through airport security checks.

The technology is built to assess a photo, often called a probe image, against a database of public photos. It uses biometric data like eye scans, facial geometry or distance between features to assess potential matches. FRT software converts the data into a unique string of numbers, called a faceprint, and will present a set of ranked potential matches from its database of images.
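
To make that matching step concrete, here is a minimal sketch in Python of how such a system might rank candidates. It is not drawn from any vendor’s software: the faceprint length, the use of cosine similarity and every name below are illustrative assumptions.

```python
import numpy as np

def rank_candidates(probe_faceprint, database_faceprints, top_k=5):
    """Rank stored faceprints by similarity to a probe faceprint.

    Hypothetical example: each faceprint is a fixed-length vector of numbers,
    and cosine similarity stands in for whatever scoring a real system uses.
    """
    # Normalize the vectors so a dot product equals cosine similarity.
    probe = probe_faceprint / np.linalg.norm(probe_faceprint)
    db = database_faceprints / np.linalg.norm(database_faceprints, axis=1, keepdims=True)

    scores = db @ probe                        # similarity of the probe to every stored face
    order = np.argsort(scores)[::-1][:top_k]   # highest-scoring candidates first
    return [(int(i), float(scores[i])) for i in order]

# Toy usage: 1,000 random 128-number "faceprints" and one probe image's faceprint.
rng = np.random.default_rng(0)
database = rng.normal(size=(1000, 128))
probe = rng.normal(size=128)
print(rank_candidates(probe, database))
```

The output of a search like this is a ranked list of lookalikes, not a definitive identification, which is why investigators are still expected to tie any candidate to the crime through traditional police work.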

When police use these systems, they are often uploading images from a security camera or body-worn camera. Clearview AI, a popular facial recognition company that often contracts with police and has developed a version of its product specifically for investigations, says it hosts more than 50 billion facial images from public websites, including social media, mugshots and driver’s license photos.

Katie Kinsey, chief of staff and tech policy counsel for the Policing Project, an organization focused on police accountability, said that she’s almost certain that if you’re an adult in the U.S., your photo is included in Clearview’s database, and is scanned when police are looking for FRT matches.

“You’d have to have no presence on the internet to not be in that database,” she said.

The use of FRT by federal law enforcement agencies goes back as far as the technology itself, more than two decades, Kinsey said, but local police departments have increased their use in the last 10 years.

Usually, police are using it in the aftermath of a crime, but civil liberties and privacy concerns are heightened with the idea that the technology could be used to scan faces in real time, with geolocation data attached, she said. Kinsey, who often meets with law enforcement officers to develop best practices and legislative suggestions, said she believes police forces are wary of real-time uses.

Boston Police attempted to use it while searching for the suspects in the 2013 Boston Marathon bombing, for example, but grainy imaging kept the technology from identifying the culprits, Kinsey said.

Wrongful Arrests

FRT’s role in wrongful arrest cases usually stems from instances in which police have no leads on a crime other than an image captured by security cameras, said Margaret Kovera, a professor of psychology at the John Jay College of Criminal Justice and an eyewitness identification expert.

Before the technology was available, police needed investigative leads to pin down suspects — physical evidence, such as a fingerprint, or an eyewitness statement, perhaps. But with access to security cameras and facial recognition technology, police can quickly conjure up several possible suspects that the technology rates as likely matches.

With millions of faces in a database, the pool of potential suspects feels endless. Because the technology finds matches that look so similar to the photo provided, someone choosing a suspect in a photo array can easily make a wrong identification, Kovera said. Without further investigation and traditional police work to connect the match chosen by the technology to a crime scene, the match is useless.

“You’re going to up the number of innocent people who are appearing as suspects and you’re going to decrease the number of guilty people,” Kovera said. “And just that act alone is going to mess up the ratio of positive identifications in terms of how many of them are correct and how many of them are mistaken.”
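
A purely hypothetical back-of-the-envelope calculation illustrates the ratio problem Kovera describes; the database size and error rate below are invented for illustration and are not drawn from any real system.

```python
# Hypothetical numbers, for illustration only.
database_size = 10_000_000   # faces searched; assume exactly one belongs to the true culprit
false_match_rate = 0.001     # suppose 0.1% of innocent faces still score as strong matches

innocent_lookalikes = (database_size - 1) * false_match_rate
print(round(innocent_lookalikes))  # roughly 10,000 innocent people surface as candidates
```

Under those made-up assumptions, even a seemingly small error rate produces thousands of innocent lookalikes for every true match, which is exactly the shift in the ratio of correct to mistaken identifications that Kovera warns about.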

In the seven known cases of wrongful arrest following FRT matches, police failed to conduct sufficient follow-up investigation, which could have prevented the incidents. One man in Louisiana spent a week in jail, despite being 40 pounds lighter than a thief allegedly seen in surveillance footage. A woman who was eight months pregnant in Detroit was held in custody for 11 hours after being wrongfully arrested for carjacking, despite no mention of the carjacker appearing pregnant.

When Williams was arrested in January 2020, he was the ninth-best match for the person in the security footage, Michael King, a research scientist with the Florida Institute of Technology’s (FIT) Harris Institute for Assured Information, testified in the ACLU’s lawsuit. And detectives didn’t pursue investigation of his whereabouts before making the arrest.

Detroit police used the expired license photo in a photo array presented to a loss-prevention contractor who wasn’t present at the scene of the crime. The contractor picked Williams as the best match to the person in the security footage. Without further investigation of Williams’ whereabouts in October 2018, when the robbery occurred, Detroit police arrested him and kept him in custody for 30 hours.

The lawsuit says Williams was informed only after several rounds of questioning that he had been arrested because of a facial recognition match. As part of the settlement, which Williams reached in the summer of 2024, the Detroit Police Department had to change the way it uses facial recognition technology. The city now operates under some of the strictest rules in the country for police use of the technology, which is otherwise regulated on a state-by-state basis.

Police can no longer go straight from facial recognition technology results to a witness identification procedure, and they cannot apply for an arrest warrant based solely on the results of a facial recognition technology database, Wessler said. Because there can be errors or biases in the technology, and by its users, guardrails are important to protect against false arrests, he said.

Emerging Laws

At the start of 2025, 15 states — Washington, Oregon, Montana, Utah, Colorado, Minnesota, Illinois, Alabama, Virginia, Maryland, New Jersey, Massachusetts, New Hampshire, Vermont and Maine — had some legislation around facial recognition in policing. Some states, including Montana and Utah, require a warrant for police to use facial recognition, while others, such as New Jersey, say that defendants must be notified of its use in investigations.

At least seven more states are considering laws to clarify how and when the technology can be used: Lawmakers in Georgia, Hawaii, Kentucky, Massachusetts, Minnesota, New Hampshire and West Virginia have introduced legislation.

Like all AI technologies, facial recognition can have baked-in bias or produce flawed results. FRT has historically performed worse on Black faces than on white faces, and has shown differences across genders, too. AI is trained to get better over time, but people tend to assume that simply involving humans in the process will catch all the problems, Wessler said.

But humans actually tend to have something called “automation bias,” Wessler said — “this hardwired tendency of people to believe a computer output’s right as many times as you tell somebody the algorithm might get it wrong.”

So when police are relying on facial recognition technology as their primary investigative tool, instead of following older law enforcement practices, it’s “particularly insidious” when it goes wrong, Wessler said.

“I often say that this is a technology that is both dangerous when it works and dangerous when it doesn’t work,” Wessler said.

Kinsey said that in her work with the Policing Project, she’s found bipartisan support for placing guardrails on police using this technology. Over multiple meetings with privacy advocates, police forces, lawmakers and academics, the Policing Project developed a legislative checklist.

It outlines how police departments could use the technology with transparency, testing and standards strategies, officer training, procedural limits and disclosure to those accused of crimes. It also says legislation should require vendors to disclose documentation about their FRT systems, and should provide ways to address violations of the rules governing their use.

The Policing Project also makes similar recommendations for congressional consideration, and while Kinsey said she believes federal guidelines are important, she does not expect federal legislation to pass any time soon. In the meantime, states will likely continue to influence one another, and recent laws in Maryland and Virginia are an example of a broad approach to regulating FRT across different areas.

Kinsey said that in her meetings with police, they assert that the technologies are essential to solving crimes. She said she believes there is space for FRT — and other technologies used by police, such as license plate readers and security cameras — but that unfettered use can do a lot of harm.

“We think some of them can absolutely provide benefits for solving crime, protecting victims,” Kinsey said. “But using those tools, using them according to rules that are public, transparent and have accountability, are not mutually exclusive goals. They can actually happen in concert.”
