How AI is transforming state and local investigations

COMMENTARY | The technology can help process data more quickly and sift through what can be distressing evidence. But agencies must tread carefully and have the right policies and training in place to be successful.
Artificial intelligence is transforming crime for both the bad actors who commit it and the investigators who solve it. In the last year, 30% of law enforcement professionals have seen an uptick in criminals' use of AI, according to Cellebrite's annual Industry Trends Survey.
At the same time, 61% of those same professionals view the technology as a powerful tool for improving the efficiency and accuracy of investigations.
These days, almost everyone carries a smartphone. When a crime is committed, that device acts as a digital witness, revealing where a person goes, who they talk to, what they search for and, often just as important, what they delete. This digital evidence creates a mountain of data for public safety agencies to sift through.
New AI solutions have emerged in recent years to help state and local law enforcement teams quickly analyze these high volumes of data, identify patterns in digital evidence and accelerate investigations. These capabilities are especially helpful in cases of internet crimes against children and those involving child sexual abuse material (CSAM), where predators are using AI to create convincing false identities as well as alarmingly realistic explicit deepfake images and videos.
Accelerating Investigations With AI
More than 90% of investigations involve digital evidence, most of it from smartphones, and 69% of investigators report they don't have enough time to review all the device data in their cases. That gap underscores the need for effective AI-powered technology to drive efficiencies, especially in smaller agencies.
In online child exploitation cases, there are typically large volumes of disturbing digital evidence, which is time-consuming to review — creating extensive backlogs — and can also be psychologically taxing for investigators. With AI-powered image detection and analysis, law enforcement can quickly reach actionable evidence in these cases, creating more efficient investigations and safeguarding investigators’ mental well-being.
AI can also expedite tedious, time-consuming tasks that resource-strapped agencies may not be able to staff adequately. In complex cases like narcotics investigations, AI automates data analysis, categorizes information and can quickly identify relevant images, all of which helps investigators pinpoint key people and places and, in some cases, piece together the full scope of a criminal network, work that would otherwise take far longer. New generative AI capabilities like chat summarization, photo-tagging and browsing history analysis further assist in establishing these kinds of connections.
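To make the idea of automated image triage concrete, here is a minimal, purely illustrative sketch in Python. It is not any vendor's product; the hash set, the placeholder classifier and the review threshold are hypothetical stand-ins for the kinds of components such a system would use. The point is simply that matching files against known-content hashes and scoring the rest with a model can sort a large batch of images into review queues, so investigators see the most likely matches first and as little harmful material as possible.

```python
import hashlib
from dataclasses import dataclass
from pathlib import Path

# Hypothetical set of SHA-256 hashes of previously identified illegal images,
# e.g. drawn from a curated hash database an agency is authorized to use.
KNOWN_HASHES = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

@dataclass
class TriageResult:
    path: Path
    category: str   # "known_match", "needs_review" or "low_priority"
    score: float

def classifier_score(image_bytes: bytes) -> float:
    """Placeholder for a real image classifier. A production system would call a
    trained model here; this stub only keeps the sketch runnable."""
    return 0.0

def triage_image(path: Path, review_threshold: float = 0.8) -> TriageResult:
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        # Exact match against known material: no investigator needs to view it again.
        return TriageResult(path, "known_match", 1.0)
    score = classifier_score(data)
    category = "needs_review" if score >= review_threshold else "low_priority"
    return TriageResult(path, category, score)

def triage_folder(folder: Path) -> list[TriageResult]:
    # Sort so the most likely matches surface first in the review queue.
    results = [triage_image(p) for p in folder.glob("*.jpg")]
    return sorted(results, key=lambda r: r.score, reverse=True)
```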
Cloud-based platforms, driven by AI, can also reduce the hours spent on tasks such as downloading data, copying it onto thumb drives and transporting the evidence to stakeholders. The time saved frees investigators to focus their energy and effort on the more complex parts of their investigations.
From data collection to analysis, AI can act as an assistant that speeds a team's time to evidence by augmenting time-consuming human tasks.
Best Practices For Agencies
As more state and local law enforcement agencies invest in AI and integrate it with their workflows, there are several best practices teams should adopt to ensure successful AI implementation and outcomes.
Create an ethical AI policy: Work with your legal team and get educated about AI so you can clearly define the parameters within which its use is acceptable as a force for good. State and local agencies must be mindful of data privacy and maintain a strong chain of custody. Teams should prioritize responsible, transparent use of AI and select only technology that operates within the same ethical guidelines.
Select the right AI tools: Invest only in technology capable of automating the most tedious work and surfacing actionable insights that accelerate investigations. For agencies handling crimes involving CSAM, AI-powered tools can help protect investigators by reducing their exposure to harmful content through automated image categorization.
Provide proper training for law enforcement: Training is key to the ethical implementation of these tools in agencies across the country. While the tools available differ from agency to agency, it's critical to stay up to date on the technology being deployed through conferences, training seminars, digital forensics courses and simulated exercises that put teams' skills to the test. These hands-on training efforts also help ensure digital evidence is admissible in court.
Ensure human oversight and analysis: To enable more efficient investigations with AI, agencies should ensure that a human verifies the technology's findings before they are used in an investigation. For example, while the technology can flag unusual patterns in data, it cannot determine whether the activity reflects criminal behavior, nor can it establish intent or motive.
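In the same illustrative spirit as the earlier sketch, the hedged example below shows what a human-in-the-loop gate might look like in code: nothing an automated tool flags becomes part of the case record until a named investigator reviews and approves it. The AIFlag and CaseRecord types, and the sample data, are hypothetical and not drawn from any real product.

```python
from dataclasses import dataclass, field

@dataclass
class AIFlag:
    """A pattern an automated tool has surfaced; fields are illustrative only."""
    description: str
    confidence: float

@dataclass
class CaseRecord:
    findings: list[str] = field(default_factory=list)

    def add_verified_finding(self, flag: AIFlag, reviewer: str, approved: bool) -> None:
        # A flag enters the case record only after a human reviewer confirms it;
        # an unapproved flag is left out of the record entirely.
        if approved:
            self.findings.append(f"{flag.description} (verified by {reviewer})")

record = CaseRecord()
flag = AIFlag(description="Unusual late-night message volume to an unknown contact",
              confidence=0.72)
# The tool flags the pattern; only the investigator decides whether it is evidence.
record.add_verified_finding(flag, reviewer="Det. Example", approved=True)
```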
Criminal use of AI will only increase. To keep pace, state and local agencies must ensure their AI strategies are comprehensive, current and grounded in human expertise.
With the right ethical tools, a willingness to adopt new technology, robust training regimens and appropriate human oversight, law enforcement can solve complex cases more efficiently while fostering trust in the communities they protect.
Jared Barnhart is the Head of Customer Advocacy at Cellebrite, a global leader in digital investigative solutions for the public and private sectors. A former detective and mobile forensics engineer, Jared is highly specialized in digital forensics, regularly training law enforcement and lending his expertise to help them solve cases and accelerate justice.