Smarter camera systems for better surveillance
Software that builds video analysis intelligence into cameras will provide earlier alerts to public safety managers.
Using a $1.57 million grant from the National Institute of Standards and Technology, a University of Houston researcher is working to make surveillance cameras smart enough to recognize and alert public-safety managers at the first sign of threats.
The project is a natural outgrowth of computer science professor Shishir Shah’s work, which has focused on enabling cameras to understand human behavior and recognize objects through algorithms that can automatically ascertain what a video camera is seeing.
The catch, Shah said, is that the algorithms are based on certain assumptions. For instance, researchers assume the distributed network of cameras is always collecting data about the locations under observation, that the information is useful and that the quality is something from which an algorithm can extract information.
In realistic situations and practical deployments, however, those assumptions may not always hold, Shah said. “Cameras are like any other piece of infrastructure. They have certain operating characteristics, operating ranges,” he said.
A network that's heavily used, for example, may limit the amount of data a video camera can push to a recording device. That may "compromise the quality of recording we have, which in turn, makes it difficult for the algorithm to process that information,” he added. At other times, the cameras may get smudged or otherwise lose focus, resulting in degraded video quality.
The first phase of Shah's three-part project addresses those kinds of low-level analysis problems, such as suboptimal conditions. He's working to get cameras to send an alert when images are out of focus or something blocks their view.
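The article does not describe Shah's actual method, but a common way to catch the out-of-focus condition he mentions is to measure image sharpness with the variance of a Laplacian filter: a sharp scene produces strong, varied edge responses, while a blurred or blocked view produces almost none. The function names, the 3x3 kernel, and the threshold below are all illustrative assumptions, not details from the project.

```python
# Hypothetical sketch: flag frames that look too blurry or occluded to
# analyze, using the variance of a 3x3 Laplacian response as a sharpness
# score. The threshold value is an assumption and would need tuning.

def laplacian_variance(frame):
    """Variance of the 3x3 Laplacian over a grayscale frame
    (a list of lists of pixel intensities)."""
    h, w = len(frame), len(frame[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (frame[y - 1][x] + frame[y + 1][x] +
                   frame[y][x - 1] + frame[y][x + 1] -
                   4 * frame[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def needs_alert(frame, threshold=50.0):
    """True when the frame's sharpness falls below the threshold."""
    return laplacian_variance(frame) < threshold

# A frame with a hard vertical edge (in focus) vs. a nearly uniform
# frame, as produced by a smudged lens or a blocked view.
sharp = [[0] * 4 + [255] * 4 for _ in range(8)]
blocked = [[128] * 8 for _ in range(8)]
```

In a deployment, a score like this would run per camera on sampled frames, with the alert sent when the score stays low for several consecutive samples rather than on a single noisy reading.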
The second phase looks at patterns. For example, a camera observing a roadway may watch traffic that goes in a certain direction, although it cannot pick up images of individual cars. “I can learn over time that the motion is always happening, let’s say, left to right,” Shah said. “All of a sudden, one fine day you see a motion happening right to left. That could be a mid-level analysis resulting in some kind of alert.”
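The mid-level check Shah describes can be sketched as a simple learner: accumulate the signed horizontal displacement of tracked objects, and raise an alert when a new track moves against the dominant learned direction. Everything below (class name, sample counts, displacement values) is an assumed illustration, not Shah's implementation.

```python
# Hypothetical sketch of the traffic-direction example: learn the
# dominant direction of motion over time, then flag a track that runs
# the other way. Positive dx means left-to-right motion.

class DirectionMonitor:
    def __init__(self, min_samples=20):
        self.displacements = []      # signed horizontal motion per track
        self.min_samples = min_samples

    def observe(self, dx):
        """Record one track's displacement; return True when it opposes
        the learned dominant direction."""
        anomalous = False
        if len(self.displacements) >= self.min_samples:
            avg = sum(self.displacements) / len(self.displacements)
            anomalous = dx != 0 and (avg > 0) != (dx > 0)
        self.displacements.append(dx)
        return anomalous

monitor = DirectionMonitor()
for _ in range(30):                   # normal traffic: left to right
    monitor.observe(+5)
wrong_way = monitor.observe(-5)       # one day, motion right to left
```

The `min_samples` warm-up prevents alerts before a norm has been learned, which mirrors Shah's point that "normal" must be observed over time before deviations mean anything.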
Complex or high-level analysis adds a layer to the patterns by associating them with semantics. For instance, object-level analysis can determine what is producing that motion -- cars or people.
Because it’s difficult to define the full span of activities that might take place in a camera’s line of sight, Shah said he opted to home in on “normal observations” within a geographical space. “The normal observation is something that has to be learned,” Shah said. “If I can learn what normal is, I stand a better chance of identifying deviations from that normal.”
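One minimal way to model "learn what normal is," assuming a scalar observation such as a per-minute count of people in view, is to keep running statistics and flag readings that land several standard deviations from the learned mean. The observation type, window size, and z-score threshold here are assumptions for illustration; the article does not specify Shah's model.

```python
# Hypothetical sketch: learn a normal range for an observed quantity
# and flag deviations from it using a z-score test.

import statistics

class NormalModel:
    def __init__(self, z_threshold=3.0, warmup=10):
        self.samples = []
        self.z_threshold = z_threshold
        self.warmup = warmup           # observations needed before alerting

    def is_deviation(self, value):
        """True if value falls outside the learned normal range."""
        if len(self.samples) >= self.warmup:
            mean = statistics.fmean(self.samples)
            stdev = statistics.pstdev(self.samples)
            if stdev > 0 and abs(value - mean) / stdev > self.z_threshold:
                return True            # anomalous readings are not learned
        self.samples.append(value)
        return False

model = NormalModel()
for count in [12, 14, 11, 13, 15, 12, 14, 13, 12, 14]:  # typical foot traffic
    model.is_deviation(count)
alert = model.is_deviation(90)        # a crowd far outside the learned norm
```

Note that anomalous readings are deliberately excluded from the learned history, so a single unusual event does not shift the model's definition of normal.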
Currently, Shah uses a centralized server for the analysis, but he said he hopes to do the computing on the edge in the future, particularly for low-level analysis. Additionally, part of the research will study how the algorithms will scale relative to bandwidth. Some of the cameras will be wireless while others will be wired, each bringing its own bandwidth concerns.
It’s worth noting, too, that Shah and his team are not designing alert software, but rather looking to integrate the video analysis results into the city of Houston’s existing infrastructure. To that end, Shah is working closely with the city’s Public Safety Video Initiative, which started 10 years ago to monitor critical infrastructure as part of the Homeland Security Department’s Urban Areas Security Initiative.
“It’s really what we call a verification system,” said Jack Hanagriff, law enforcement liaison for the city. “It’s not surveillance, it’s not a crime-hunting system. It’s basically just to verify something was going on.”
Today, more than 900 cameras are deployed, and the city contracted with Vidsys to provide the middleware to integrate the cameras with other networks, such as those associated with stadiums, water treatment plants and other potential terrorist targets. Departments such as emergency management agencies, some public works entities and the Parks and Recreation Department can now share video data.
“We have a philosophy: The more eyes on the system, the better. So we try to get people [involved] who could have a vested interest,” Hanagriff said.
After hosting two major sporting events -- the Super Bowl in February and the 2016 NCAA Division I Men’s Final Four basketball tournament -- city officials decided they needed a secondary monitoring system in place for some of these venues, Hanagriff said. As a result, Houston expanded its partnership with Vidsys and started working with other companies such as Verizon and Access Communications, which donated $80,000 worth of cameras to the city for use during the Super Bowl. Now officials are deploying those cameras in areas that lacked coverage.
“We’re creating a technology sandbox, which I call a playground because everybody gets to play on it,” Hanagriff said. “Public safety officials get exposed to new technology without disrupting our existing network. Industry partners get access to subject-matter experts, and they get to demonstrate their technology to other stakeholders and private business partners.”
Editor's note: This article was changed Aug. 23 to clarify the timeline of Vidsys' partnership with Houston.