Framework would make AI decisions less murky
A new framework would allow users to understand the rationale behind artificial intelligence decisions.
The work is significant given the push to move away from "black box" AI systems, particularly in sectors such as the military and law enforcement, where decisions must be justified.
“One thing that sets our framework apart is that we make these interpretability elements part of the AI training process,” says Tianfu Wu, first author of the paper and an assistant professor of computer engineering at North Carolina State University.
“For example, under our framework, when an AI program is learning how to identify objects in images, it is also learning to localize the target object within an image, and to parse what it is about that locality that meets the target object criteria. This information is then presented alongside the result.”
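To make the idea concrete, here is a minimal sketch, not the authors' published architecture, of how an interpretability element can be built into training rather than added afterward: a toy classifier learns its class decision and a localization "evidence map" at the same time, with both objectives in one loss. The `InterpretableClassifier`, the evidence map, and the `alpha` weighting are illustrative assumptions.

```python
# Hypothetical sketch (not the paper's code): a model trained jointly on a
# decision task and on localizing the image regions that support the decision.
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterpretableClassifier(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        # Shared convolutional backbone (kept tiny for illustration).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        # Decision head: class scores from pooled features.
        self.classifier = nn.Linear(32, num_classes)
        # Interpretability head: a per-pixel "evidence" map that localizes
        # the parts of the image supporting the decision.
        self.localizer = nn.Conv2d(32, 1, kernel_size=1)

    def forward(self, x):
        feats = self.backbone(x)                              # B x 32 x H x W
        logits = self.classifier(feats.mean(dim=(2, 3)))      # global average pool
        evidence_map = torch.sigmoid(self.localizer(feats))   # B x 1 x H x W
        return logits, evidence_map

def joint_loss(logits, evidence_map, labels, target_masks, alpha=0.5):
    # The localization objective is part of training, not a post-hoc explanation.
    cls_loss = F.cross_entropy(logits, labels)
    loc_loss = F.binary_cross_entropy(evidence_map, target_masks)
    return cls_loss + alpha * loc_loss

# Example usage with random tensors standing in for images and ground-truth masks.
model = InterpretableClassifier(num_classes=10)
images = torch.randn(4, 3, 64, 64)
labels = torch.randint(0, 10, (4,))
masks = torch.rand(4, 1, 64, 64)
logits, evidence = model(images)
joint_loss(logits, evidence, labels, masks).backward()
```

At inference time, the same forward pass that produces the class scores also produces the evidence map, which is what lets the explanation be "presented alongside the result" rather than reconstructed later.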
In a proof-of-concept experiment, researchers incorporated the framework into the widely used R-CNN artificial intelligence object identification system. They then ran the system on two well-established benchmark data sets.
The researchers found that incorporating the interpretability framework into the AI system did not hurt the system’s performance in terms of either time or accuracy.
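The kind of check this result implies can be sketched as follows; this is an illustrative harness, not the authors' experimental setup. A standard torchvision Faster R-CNN stands in for "the R-CNN system," and the `ExplanationWrapper` is a hypothetical stand-in for the framework; accuracy would be compared the same way on the benchmark data sets.

```python
# Hypothetical timing comparison: a baseline detector versus the same detector
# with an added explanation head, to check that explanations cost no extra time.
import time
import torch
import torch.nn as nn
import torchvision

class ExplanationWrapper(nn.Module):
    """Wraps a detector and adds a cheap per-image evidence map (toy example)."""
    def __init__(self, detector):
        super().__init__()
        self.detector = detector
        self.localizer = nn.Conv2d(3, 1, kernel_size=1)  # toy evidence head

    def forward(self, images):
        detections = self.detector(images)
        evidence = [torch.sigmoid(self.localizer(img.unsqueeze(0))) for img in images]
        return detections, evidence

def mean_inference_time(model, images, n_runs=10):
    """Average forward-pass wall-clock time in seconds over n_runs."""
    model.eval()
    with torch.no_grad():
        start = time.perf_counter()
        for _ in range(n_runs):
            model(images)
    return (time.perf_counter() - start) / n_runs

baseline = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=None)
wrapped = ExplanationWrapper(baseline)
batch = [torch.randn(3, 480, 640)]

print(f"baseline:      {mean_inference_time(baseline, batch):.3f} s/image")
print(f"interpretable: {mean_inference_time(wrapped, batch):.3f} s/image")
```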
“We think this is a significant step toward achieving fully transparent AI,” Wu says. “However, there are outstanding issues to address.
“For example, the framework currently has the AI show us the location of an object, and those aspects of the image that it considers to be distinguishing features of the target object. That’s qualitative. We’re working on ways to make this quantitative, incorporating a confidence score into the process.”
The researchers will present the paper at the International Conference on Computer Vision in Seoul, South Korea.
Support for the work came from the Army Research Office and Defense University Research Instrumentation Program, as well as from the National Science Foundation.
This article first appeared on Futurity.