What’s in store for public safety tech in 2025
In the coming year, artificial intelligence and other tech solutions will be key for understaffed law enforcement agencies looking to keep their communities safe.
Police departments have long grappled with staff, budget and capacity shortages that impede their ability to detect and mitigate crime and deliver justice to their communities. Experts say technology, especially artificial intelligence, will help fill workforce and service delivery gaps in the new year and beyond.
The exploration and use of automation and analytics among law enforcement agencies ramped up in 2024, particularly as a potential “force multiplier” for short-staffed police agencies, said Jim Burch, president of the National Policing Institute, a research organization.
In the years since the pandemic, police agencies have been steadily recovering from staffing shortages, according to an April survey of 214 law enforcement agencies conducted by the Police Executive Research Forum. Respondents reported hiring 11,089 sworn officers last year, up from 8,558 in 2020.
But many law enforcement agencies, particularly larger departments, which the report defined as those with more than 250 employees, still struggle to attract and retain staff.
“They don’t have the ability to hire more staff to do the things they need to do, whether that be accountability processes, compliance checks or analytics,” Burch said. “I think [police agencies] are really looking forward to a time where technology can be that force multiplier to give them some of the answers they need without having to hire a lot of new staff to do that.”
Indeed, a report released earlier this year by Mark43 found that “a majority of U.S. first responders and law enforcement not only support AI integration but also trust their agencies to use it responsibly” and that “law enforcement’s appetite for AI is increasing year over year.”
Burch pointed to drafting police incident reports as one of “the hottest conversations right now” around how law enforcement agencies can leverage AI. In 2025 and beyond, he said, authorities will continue to explore how AI could not only streamline and speed up the reporting process but also produce more detailed summaries than human staff could capture.
AI could also play a larger role in other processes like body camera footage review, Burch said. Some agencies across the U.S. have already begun exploring and researching AI’s ability to, for instance, assess officers’ performance during citizen interactions based on camera footage.
But as the AI hype follows governments into 2025, “there needs to be responsible AI principles that govern the implementation of these tools,” said Thomas Randall, lead researcher of AI market analysis at Info-Tech Research Group. Agencies should also consider starting with low-stakes use cases, such as automating nonemergency phone calls, as they experiment with the technology’s potential.
The Mark43 report also underscored the importance of agencies developing comprehensive AI governance frameworks to manage risks associated with the technology, such as inaccurate or biased results. It also recommended that public safety agencies work with trusted AI vendors that have expertise in implementation and compliance.
Burch added that police agencies will maintain focus on other technologies like gunshot detection and license plate reader tech, despite ongoing concerns that such solutions could lead to overpolicing in minority communities.
Several major cities across the U.S., for instance, have stopped using the popular gunshot detection system ShotSpotter, citing concerns over its cost, bias and inefficacy.
In 2025, Burch said he anticipates that law enforcement agencies will continue to leverage these tech solutions with a stronger focus on responsible deployment, instead of turning them on and assuming they’ll work.
As long as police departments lack the staffing to put officers on every street to detect and respond to public safety threats, technology like AI and crime detection and prediction tools will be needed to plug critical service gaps.
But “clearly, there's acknowledgement of the need to mitigate risk [and] to be accountable for the use of technologies,” Burch said. “My sense is that as we go forward, the encouragement to use the technology in responsible ways will likely outweigh those who are seeking to curtail [their] use.”