Can chatbots make reporting suspicious activity easier?
Researchers at the University of Nebraska at Omaha will work with Sarpy County to develop a chatbot that walks users through identifying and reporting suspicious activity.
Researchers in Omaha are beginning work on a chatbot they hope will improve the identification and reporting of suspicious activity.
Joel Elson, assistant professor of IT innovation, and Erin Kearns, assistant professor of criminology and criminal justice, both at the University of Nebraska at Omaha, are working with the school’s National Counterterrorism Innovation, Technology and Education (NCITE) Center on the two-phase project.
In the first phase, they will conduct a national survey, a local study and focus groups to learn about barriers to tip reporting and processing. The second phase involves developing and testing a chatbot.
The idea for the project came about when Sarpy County, Neb., Sheriff’s Capt. Kevin Griger approached NCITE about facilitating tip reporting through technology. His office uses Awareity, which includes a web form for anonymous tip submission, but hesitation to speak up persists, he said.
People "don’t want to be a snitch," Griger said, or they don’t think someone they know will really commit a violent act. The county receives about 200 tips a year, but he thinks there could be more.
“The whole goal of this is preventing a Columbine, a Parkland, a Virginia Tech,” he said in a university news article. The NCITE project "will help us better understand why and how people are giving tips — why they are not giving tips."
Other reasons for not reporting suspicious activity include not wanting to get involved, fear of repercussions, distrust of law enforcement and not wanting to waste investigators’ time, Kearns said. But technology can help remove some of those barriers. “You’re interacting with an interactive computer platform, so you’re not wasting a human being’s time. If you’re familiar with the technology, if you trust the technology, you’re removing that ‘I don’t trust police’ potential barrier that we know people have when it comes to reporting,” she said.
Another reason she and Elson landed on the chatbot idea is that people trust technology. “In information disclosure, people tend to disclose more sensitive, more personal information to machines than they do other people,” Elson said.
The researchers’ goal with the chatbot is twofold. One is to educate the public and increase awareness about what counts as suspicious activity and how and where to report it. To that end, the chatbot will walk users through their concerns and offer logical next steps, which may end with an offer to help them report an incident.
The Department of Homeland Security's “If You See Something, Say Something” campaign launched nationally in 2010, but it is still not widely known, Kearns said. The chatbot can direct people to that campaign and other official information on what constitutes suspicious actions.
“A key thing to focus on is the term chat,” Elson said. “The idea with the chatbot is beyond reporting; the chatbot can be a conversational partner to help you learn, to help you explore and investigate ‘What is something that I saw? Is that worth pursuing more?’ Part of the chatbot’s consideration is to engage people conversationally.”
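The project has not published a design, but the flow the researchers describe, walking a user through what they observed and then offering the option to report, can be pictured with a short sketch. The questions, wording and Python structure below are purely illustrative assumptions, not part of the NCITE work.

```python
# Hypothetical sketch only: the NCITE project has not published a design.
# It illustrates the kind of guided flow described above -- asking a few
# structured questions about what was observed, then offering, rather than
# forcing, the option to file a tip.

QUESTIONS = [
    ("what", "What did you see or hear that concerned you?"),
    ("where", "Where did it happen (address, school, online platform)?"),
    ("when", "When did you notice it?"),
    ("who", "Do you know who was involved? You can say 'prefer not to say'."),
]

def run_tip_conversation():
    print("I can help you think through what you saw and, if you want, report it.")
    answers = {}
    for key, prompt in QUESTIONS:
        answers[key] = input(prompt + "\n> ").strip()

    # Offer a logical next step instead of requiring a report.
    choice = input("Would you like me to submit this as an anonymous tip? (yes/no)\n> ")
    if choice.strip().lower().startswith("y"):
        print("Thanks. Your tip will be forwarded to the local threat assessment team.")
        return answers  # in a real system this would go to a reporting backend
    print("Okay. You can also call your local non-emergency line or visit "
          "dhs.gov/see-something-say-something for guidance.")
    return None

if __name__ == "__main__":
    run_tip_conversation()
```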
The other goal is to be a trustworthy resource that puts reporting in users’ control. “Technology is able to multiply exponentially the amount that a single … individual can do,” Elson said. “The trick is designing it right and knowing the key things to design around such as privacy and information transparency. We want people to trust the technology system, or else it becomes just another thing that they don’t trust.”
Elson expects the chatbot to be something law enforcement offices can support without investing in additional technology. He plans to build it to work with application programming interfaces so that it can be “plugged and played.”
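As a rough illustration of that plug-and-play idea, the sketch below shows how a chatbot backend might hand a finished tip to an agency’s existing tip-intake system over a generic REST endpoint. The URL, field names and API key are invented for this example; they do not represent Awareity’s or any other vendor’s actual interface.

```python
# Hypothetical sketch: forwarding a structured tip collected by a chatbot to an
# agency's existing tip-intake API, so the agency keeps its current tooling.
# Endpoint, fields and credential below are placeholders for illustration.
import json
import urllib.request

TIP_API_URL = "https://tips.example-county.gov/api/v1/tips"  # placeholder endpoint
API_KEY = "demo-key"                                          # placeholder credential

def submit_tip(answers: dict) -> int:
    """POST a tip gathered by the chatbot and return the HTTP status code."""
    payload = {
        "source": "chatbot",
        "anonymous": True,
        "details": answers,  # e.g. the what/where/when/who fields gathered in conversation
    }
    request = urllib.request.Request(
        TIP_API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-Api-Key": API_KEY},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    status = submit_tip({"what": "threatening post", "where": "local high school"})
    print("Tip submitted, status:", status)
```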
Specifics such as whether it will accept photos, video or map pins will be determined later, based on the survey and research feedback. Everything will be tested against static reporting tools, such as the web form that Griger’s office currently uses, to ensure that the chatbot brings in better-quality information.
“We can develop the best chatbot in the world that people love, but if it’s getting us worse information, that’s counterproductive,” Kearns said.
Officials take tips seriously, Griger said. The Sarpy County threat assessment team has been investigating anonymous reports since 2016. Most tips are school-related, he said, but the team has also been able to avert several suicides.
“It has applications to any of the threats that we get,” Griger said of the future chatbot.
NCITE is a DHS Center of Excellence, and this project is one of 37 prevention-focused research projects the department selected last month for awards totaling $20 million through its Targeted Violence and Terrorism Prevention Grant Program.
Stephanie Kanowitz is a freelance writer based in northern Virginia.