AI is helping police solve more crimes, but some are still worried
At a recent Senate hearing, concerns were raised about false arrests and how little is known about the accuracy of some AI products.
Miami’s assistant police chief told senators at a hearing Wednesday that the department has been able to solve far more murders and violent crimes in the city since it started using artificial intelligence. But reports that the use of AI by police departments has led to false arrests have also prompted questions about the new technology.
An official with the Government Accountability Office said during her testimony at the hearing that it is not known how accurate many of the AI products used by police are.
The hearing came a week after 18 Democratic senators raised concerns that the Justice Department is not doing enough to regulate the use of AI by police.
“We are deeply concerned that facial recognition technology may reinforce racial bias in our criminal justice system and contribute to arrests based on faulty evidence,” the senators, led by Raphael Warnock of Georgia, wrote in a letter to Attorney General Merrick Garland.
One story shared in the letter was that of a Black man in Georgia who was arrested for using stolen credit cards to buy designer purses in Louisiana. The man remained behind bars for six days before authorities realized his arrest was the result of a bad facial recognition match and that he had never been to the state.
New Jersey Sen. Cory Booker, who also signed the letter, said at the Judiciary subcommittee hearing that when he was mayor of Newark he saw how new technologies like body cameras and ShotSpotter, which detects gunfire, can improve law enforcement. But he called for police to take steps recommended by experts as “common sense precautions.”
Among those steps: the Justice Department should make sure that law enforcement agencies receiving millions in federal funding for AI use software that has been independently shown to be accurate.
How AI Is Helping Police Departments
Despite the concerns, Armando Aguilar, the Miami assistant police chief, said AI has helped his city become “safer today than in any other time in our history.”
The department, Aguilar told senators, uses AI in a number of ways, including facial recognition, reading license plates, monitoring potential threats on social media and using ballistic evidence to “connect the shots” between shootings.
Aguilar also cited a Florida International University study that said detectives have a 66% greater chance of finding suspects in violent crimes when using the technology.
Before AI, he said, the police department was only able to arrest suspects in 45% of murders and less than 38% of violent crimes. However, after it started using the technology in 2023, it solved 68% of the murders that occurred that year and 58% of the violent crimes.
Not using AI, Aguilar said, would not only mean that there would be “more crimes that would go unsolved, [but also that] criminal suspects would be allowed to continue to victimize other people.”
The assistant police chief also said the technology frees up officers to work on more cases. Being able to, for instance, identify a suspect by comparing their photo with a database of photos “can cut out hours, even weeks’ worth of time,” he said.
Aguilar, though, said that the Miami Police Department created policies for using the technology after hearing concerns from the community during public hearings.
Some of those policies, for instance, bar the department from treating a facial recognition match alone as sufficient evidence to arrest someone. Rather, Aguilar said, an AI-based facial match would be treated like a tip that someone may have committed a crime. “Our detectives would do their due diligence and either try to discount or corroborate it with other evidence that could tie that person to the crime scene.”
“This is absolutely not a substitute for traditional investigative methods,” he said. “It complements traditional investigative methods.”
Aguilar’s point was emphasized by the National Sheriffs’ Association, Major County Sheriffs of America and the Major Cities Chiefs Association in a letter to Booker and Arkansas Sen. Tom Cotton.
“It is essential to recognize that AI-powered technology serves as an investigative assistant to law enforcement rather than a replacement for the human element,” the associations wrote, saying they would oppose “blanket” regulations.
Cotton, a Republican, echoed the groups’ sentiment. “What AI is in law enforcement is not Robocop,” he said. “It's not the Terminator, or The Matrix.”
Concerns About AI in Law Enforcement
Still, concerns remain about the use of AI in policing.
In their letter to the Justice Department, the 18 Democratic senators said there have been six cases, including the one in Georgia, where people were falsely arrested “based on little or nothing more than an incorrect facial recognition match.” They noted that in all six cases, the individuals were Black.
The senators cited a National Institute of Standards and Technology, or NIST, study that found that facial recognition technology is less accurate when analyzing dark-skinned faces or those of Native Americans and Asians. The senators also noted that it would violate civil rights laws for the department to give out grants to “deploy programs or technologies that may result in discrimination.”
Karen Howard, the GAO’s director of science, technology assessment and analytics, said that some police officers may need better training to understand AI's limitations and called for creating national standards for training police in using the technology.
To illustrate the need for training, Howard gave the example of a detective using AI to compare a fingerprint taken at a crime scene against a database of prints. All the AI will do is tell the detective which fingerprints in the database are the closest matches. It does not say how likely it is that any of those prints actually match.
“That's all the algorithm is able to do,” Howard said.
In other words, there might only be a 35% chance that the fingerprints match. But “analysts often do not understand that,” she said. “They often think, ‘This is my number one candidate,’ or ‘These are my top 10. It’s got to be one of these people.’ In reality, it might be somebody completely different.”
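Howard’s point is that these systems rank candidates by similarity, not by probability of a true match. A minimal sketch of that distinction, using entirely hypothetical data and a toy scoring function (no real fingerprint system works this simply), might look like:

```python
def rank_candidates(crime_scene_print, database):
    """Return database entries sorted by similarity to the crime-scene print.

    The similarity score only orders candidates. It is NOT the probability
    that any candidate is the true source of the print -- the true source
    may not be in the database at all.
    """
    def similarity(a, b):
        # Toy metric: overlap between two sets of extracted features.
        shared = len(set(a) & set(b))
        total = len(set(a) | set(b))
        return shared / max(total, 1)

    return sorted(
        ((name, similarity(crime_scene_print, features))
         for name, features in database.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical database: each person maps to a set of feature IDs.
db = {
    "person_a": {1, 2, 3, 8},
    "person_b": {1, 2, 3, 4, 5},
    "person_c": {7, 8, 9},
}
ranked = rank_candidates({1, 2, 3, 5}, db)
# ranked[0] is merely the closest match in THIS database; a score below 1.0
# signals imperfect overlap, not a calibrated match probability.
```

The top-ranked name here is simply whichever stored print overlaps most with the query, which is exactly why Howard warns that “my number one candidate” can still be the wrong person.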
In the Georgia case, Randal Quran Reid was arrested because the police used facial recognition to identify three suspects from video taken by the store. Reid’s attorney alleged that the detective issued a warrant for Reid’s arrest “without doing anything else to determine whether or not Quran was actually the individual.”
Another concern identified by Howard is that while NIST evaluates software, the federal agency can only examine algorithms that are submitted by the companies that produce them. University of California law professor Rebecca Wexler said she and other researchers sought permission from one AI company to evaluate its software but were denied.
Howard called for more AI to be evaluated by independent researchers. While the best are “highly accurate and effective,” she said, “there are hundreds of algorithms in development and on the market. Performance can vary significantly among them.”
According to studies, she said, “the best results may be obtained from the combination of a highly accurate algorithm and a well-trained analyst.”
Kery Murakami is a senior reporter for Route Fifty, covering Congress and federal policy. He can be reached at kmurakami@govexec.com. Follow @Kery_Murakami