It's time for consensus on facial recognition in law enforcement
Like fingerprint identification technology, facial recognition has the potential to significantly improve the effectiveness and efficiency of police investigations.
The FBI began using Automated Fingerprint Identification Systems (AFIS) in 1975. Over the past 44 years, the technology has become a reliable investigative tool, deployed within a procedural framework that empowers law enforcement while protecting public interest.
Now there’s a new tool, facial recognition, and like fingerprints it has the potential to significantly improve the effectiveness and efficiency of police investigations.
That’s particularly true in cases of violent crime, burglary, kidnapping and missing persons, where establishing a timeline and identifying suspects as quickly as possible is key.
Today, some of our nation’s most heinous cases remain unsolved far too often -- according to the most recent FBI Uniform Crime Reporting data, fewer than 50% close with an arrest. It’s time to empower law enforcement agencies with facial recognition so they can do what they do best: keep our communities safe.
While the Center for Data Innovation found that almost half of Americans are comfortable with police use of facial recognition under certain circumstances, there remain many vocal opponents who, I believe, are unaware of the realities of investigative work and have a pessimistic view of law enforcement agencies’ commitment to preserving the public's rights.
Armed with a better understanding of how biometrics are used today and how facial recognition can be deployed responsibly -- adhering to existing guidelines for other biometric investigation technologies -- I think those opponents and most Americans would be comfortable with police use of facial recognition.
In the past, forensic investigators would pull latent fingerprints from the scene of a crime and manually compare them to known prints. Today crime scene prints are digitized, and specially trained fingerprint examiners mark the unique features, or “minutiae,” on the print before submitting it for an AFIS search.
The fingerprints in AFIS are collected directly from apprehended criminals. The search uses complex algorithms to compare the minutiae of the crime scene print against thousands and sometimes millions of previously collected, known prints, returning potential matches much more quickly than the manual searches of the past.
Law enforcement agencies that want to use facial recognition should adhere to the same procedures they do with fingerprints. At a crime scene, investigators would recover videos or still images, if available, and prepare them for processing in an automated biometric identification system. An ABIS is simply an expanded database, which could include facial images, fingerprints and other biometric modalities.
As is the case with AFIS today, facial searches in ABIS would only return a list of potential matches ranked according to an accuracy metric known as “match certainty.” These potential matches are then evaluated by a trained (human) examiner who ultimately makes an identification decision.
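To make the candidate-list concept concrete, here is a minimal sketch in Python of how a system might rank potential matches by match certainty and return only those above a threshold for human review. The gallery records, scores and the 0.90 threshold are invented for illustration; they do not reflect any vendor's actual API or NIST-determined operating point.

```python
# Illustrative sketch only: subject IDs, scores and the threshold are
# hypothetical. A real ABIS derives match certainty from biometric
# feature comparison against enrolled records.

def rank_candidates(scores, threshold):
    """Return (subject_id, score) pairs at or above the threshold,
    ranked from highest to lowest match certainty."""
    candidates = [(sid, s) for sid, s in scores.items() if s >= threshold]
    return sorted(candidates, key=lambda c: c[1], reverse=True)

# Hypothetical match-certainty scores for one probe image.
probe_scores = {"subject_017": 0.97, "subject_042": 0.91, "subject_088": 0.55}

# The system returns only a ranked shortlist; a trained human examiner
# makes the final identification decision.
shortlist = rank_candidates(probe_scores, threshold=0.90)
```

The key point the sketch captures is that the software never outputs "a match" -- it outputs a ranked shortlist, and everything below the threshold is withheld from the examiner entirely.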
Match certainty thresholds for biometric systems are determined through collaborative testing across law enforcement agencies, technology vendors and the National Institute of Standards and Technology, which is widely considered the global authority on biometric systems’ accuracy and effectiveness. As facial recognition systems have matured over the past 20 years, NIST has been heavily involved in determining appropriate thresholds for system accuracy.
In the most recent NIST tests against a database of 12 million mug shots, the top performing recognition algorithm failed to correctly match a face in just 0.45% of searches, and falsely identified a face even less often. The difference in accuracy between the top performers and the next 50 or more algorithms is typically tenths or even hundredths of a percent.
It’s important to understand how minuscule that variance in accuracy is across algorithms in tightly constrained tests. Law enforcement agencies should absolutely seek out facial recognition technologies that independent testing determines are highly accurate, but they must also weigh data storage and processing costs, data security and match speed. No technology is perfect, and law enforcement agencies understand that, which is why human intervention is required for identification decisions and other verification procedures. For the same reason, independent testing of every factor of a biometric system’s performance is critical to law enforcement purchase decisions.
Unfortunately, some agencies have chosen to use facial recognition technologies that have not been tested by NIST, and they have been rightly criticized for doing so. Companies that claim their systems can’t be separated from other proprietary technologies for independent testing call into question their compatibility with trusted investigation protocols, the data used to develop their matching algorithms, and their viability as secure, fast and cost-effective tools for investigation that can perform reliably without dependence on a private company.
Facial recognition algorithms that haven’t been independently tested and scored by NIST should be disqualified from law enforcement agencies’ consideration.
Facial recognition can be deployed to great effect as a lead generation tool. But it must be done in a way that limits law enforcement agencies’ intrusion into public life and requires human expertise in final decision-making, just as it’s been done with fingerprints for decades. Law enforcement agencies must be proactive and specific in communicating the methods by which they’ll use facial recognition technologies and their goals for doing so. Through collaborative and transparent efforts to responsibly deploy powerful biometric technologies, the public can remain confident that, in the hands of law enforcement agencies, facial recognition serves their interest in safety while protecting their rights to privacy.
Editor's note: This column was changed May 16 to correct the name of the National Institute of Standards and Technology.