AI in law enforcement is risky, but holds promise
Leaders should not be reluctant to use AI in controversial applications, even if they risk blowback, one city’s CIO advises.
For some, the use of artificial intelligence in law enforcement might conjure images of the movie “Minority Report,” where the Precrime Division arrests suspects before they can commit any actual crimes.
Others may envision a dystopian surveillance state where residents can be identified by facial recognition technology and tracked through their phones, networked cameras and automated license plate readers.
Local leaders acknowledge that imperfect algorithms attempting to predict where crimes might occur can make mistakes and perpetuate existing biases and stereotypes about certain communities. Some have considered banning facial recognition in public spaces because of its potential for abuse.
But for many, AI’s possible benefits make it worth trying.
In a speech at the International City/County Management Association’s recent conference in Palm Springs, California, Corona Chief Innovation Officer Chris McMasters outlined the California city’s approach.
McMasters said Corona is mapping where crimes occur to identify hotspots, then using AI-driven data analysis to determine where to direct policing resources. Identifying crime patterns can help locate stash houses for drugs, McMasters said, but the technology still needs human oversight to prevent mistakes. The city also wants to launch a real-time crime center, with AI integrated into its cameras to try to spot incidents as they occur.
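To make the idea concrete, the kind of hotspot mapping McMasters describes is often built on density-based clustering of incident locations. The sketch below is illustrative only, with made-up coordinates and parameters, and is not Corona’s actual system; any output would still need the human oversight McMasters stresses.

```python
# A minimal, illustrative sketch of crime "hotspot" detection via
# density-based clustering. Incident data and parameters are hypothetical;
# a real system would use projected coordinates, richer features and
# human review before directing any resources.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical incident locations (latitude, longitude).
incidents = np.array([
    [33.8753, -117.5664],
    [33.8758, -117.5660],
    [33.8751, -117.5669],
    [33.8755, -117.5662],
    [33.9020, -117.5400],  # isolated incident, should be labeled noise
])

# Group incidents that fall within roughly 150 m of each other.
# eps is in degrees for simplicity; ~0.0015 deg latitude is about 150 m.
clusters = DBSCAN(eps=0.0015, min_samples=3).fit(incidents)

for label in sorted(set(clusters.labels_)):
    if label == -1:
        continue  # -1 marks noise points with no surrounding cluster
    members = incidents[clusters.labels_ == label]
    print(f"Hotspot {label}: {len(members)} incidents, "
          f"centroid {members.mean(axis=0)}")
```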
Balancing the desire to innovate against the responsibility to protect people, especially groups that have faced discrimination in the past, remains delicate, however.
“We don't want to tie everyone down to where we can't be innovative, and we can't use it,” McMasters said. “Yet we want to make sure that people are safe, and that we are doing it responsibly.”
Meanwhile, McMasters said the police chief is trying to “get ahead” of officers’ desire to use AI when they file reports, by outlining where they can and cannot use the technology.
Integrating AI into law enforcement is a big shift from the tech’s other uses in government, which include translating government communications, drafting documents, handling traffic signal prioritization and ingesting data to produce insights and analysis. McMasters acknowledged those applications are relatively low risk.
In policing, which directly impacts people’s lives and futures, the stakes are higher.
“Anytime you're looking at criminal-type things, there's a hefty weight to that particular thing,” McMasters said. “There's lots of laws governing what we can and can't do already that are on the books. There's a balance: that risk vs. reward thing. Everyone's trying to figure it out.”
Outside groups are already skeptical of police use of AI. In a policy brief, the NAACP warned that the use of predictive policing and AI within law enforcement agencies can “increase racial biases,” even as they aim to improve “efficiency and objectivity.”
The organization advocated rigorous oversight of the technologies, as well as transparency regarding their use, community engagement and legal frameworks. It also called for a ban on the use of historical crime data and other sources “known to contain racial biases in predictive policing algorithms.”
Lawmakers have voiced concerns too. In a January letter to U.S. Attorney General Merrick Garland, seven lawmakers from both houses of Congress urged the Department of Justice to halt all grants for predictive policing systems “until the DOJ can ensure that grant recipients will not use such systems in ways that have a discriminatory impact.”
The experiences of other cities may offer insight into how AI can be useful for law enforcement. During a panel discussion at the recent Smart City Expo USA in New York City, New Orleans CIO Kimberly LaGrue said AI can be a “force multiplier” and help officers do their jobs better. It can, for example, help law enforcement identify individuals subject to temporary restraining orders who continue to frequent areas they are banned from.
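LaGrue’s example boils down to a geofence check: is a detected location inside a zone the person is barred from? Below is a minimal sketch assuming a hypothetical, simple circular stay-away zone; in practice a match would only flag the case for human review, not trigger enforcement automatically.

```python
# A hedged sketch of the kind of automated check LaGrue describes: flagging
# when someone subject to a stay-away order is detected inside the zone they
# are barred from. Coordinates and the radius here are hypothetical.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical stay-away zone: 300 m around a protected address.
ZONE_CENTER = (29.9511, -90.0715)
ZONE_RADIUS_M = 300

def violates_order(detection_lat, detection_lon):
    """True if a detection falls inside the restricted zone."""
    dist = haversine_m(detection_lat, detection_lon, *ZONE_CENTER)
    return dist <= ZONE_RADIUS_M

# A detection roughly 100 m from the center is flagged for human review.
print(violates_order(29.9520, -90.0715))  # True
```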
McMasters said local leaders should not fear new technology, even in areas like public safety where it could draw controversy and backlash from residents.
“My worry is that we’re so risk averse sometimes that we’re not willing to try,” he said.