Reality Check | A new twist on the DRM
Commentary: Creating an infrastructure that exposes the underlying data on Web sites would go a long way toward improving information sharing.
Michael Daconta
IN A RECENT ESSAY, "Government Data and the Invisible Hand" (GCN.com/1226), a group of renowned Princeton University professors proposes that the next administration achieve government transparency by requiring public government Web sites to use "an infrastructure that 'exposes' the underlying data."
That is the same guidance the Federal Enterprise Architecture Data Reference Model (DRM) offers to improve cross-agency
information sharing. The two approaches attack the problem from
different sides of the same information-sharing coin: one from an
existing infrastructure perspective (bottom-up) and the other from
public Web sites (top-down).
But if DRM 2.0 was released in December 2005, why do the
Princeton professors need to write their article at all? Why are we
not already there?
The main reason is that, with a few exceptions, the DRM has no
effective forcing function, that is, a means of making
agencies get on board. As a result, it languishes. We have the road
map to where we need to go, but not many are willing to pay for the
gas to get there.
In many ways, the problem with information sharing is the same.
It's a nice idea that is easy to regard as a luxury rather than a necessity; thus the need for a forcing function. At the
Homeland Security Department, the forcing function for the DRM was
the requirement to use the National Information Exchange Model for
the agency's exchange layer. This requirement has been
effective and drives the entire data management effort. But what
about agencies that don't use NIEM?
A possible tack is to align information management with a
service-oriented architecture initiative. However, most SOA
initiatives are in their infancy and too immature to make a
significant impact. Thus, we are left with an architecture for exposing data for information sharing but no impetus to implement it.
So how do we break this logjam? That is where the Princeton
proposal shines. It recommends that public Web sites publish data
based on an infrastructure that is identical to the one proposed in
the DRM. That is an important strategic shift from the distributed manner in which data is published now.
Many government Web sites are manual creations: they are
separate from the information technology systems that produce the
data. Government Web sites are more a public relations function
than an IT function. Answering new questions that crop up in
response to world events involves a manual fire drill in which an
army of administrators scurries around collecting information
to publish on the Web site. That process is notoriously unreliable
and inefficient. And it is analogous to the state of information
sharing before the 2001 terrorist attacks.
The DRM was launched in response to the need for automated
information sharing. It lays out a framework that creates an
infrastructure to expose the underlying data, as the Princeton
professors recommend. That infrastructure should be built and connected to public Web sites as just another flavor of information sharing.
In other words, the infrastructure that the Princeton professors
recommend is the same information-sharing infrastructure required
to successfully implement the mandates of the Intelligence Reform
and Terrorism Prevention Act of 2004.
That makes a lot of sense because the act's mandates are an excellent forcing function for both public Web sites and a DRM-driven information-sharing infrastructure.
Sharing data with the public is no different in terms of data
quality, timeliness and relevance than sharing data with external
agencies. The public is just another consumer of data with a different privilege set.
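To make that idea concrete, here is a minimal sketch of what exposing underlying data as a machine-readable feed might look like. The dataset, field names and endpoint path are hypothetical illustrations of my own, not anything prescribed by the DRM or NIEM; the point is only that one structured feed can serve the public Web site, a partner agency and the public alike, with privileges handled separately.

```python
# Minimal sketch: expose an agency's underlying data as machine-readable JSON
# rather than hand-built Web pages. The records, field names and endpoint path
# below are hypothetical examples, not part of the DRM or NIEM specifications.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Records an agency system might already maintain internally (made up here).
GRANTS = [
    {"id": "G-1001", "program": "Port Security", "amount": 250000},
    {"id": "G-1002", "program": "Flood Mitigation", "amount": 125000},
]

class DataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # One structured feed serves every consumer -- the public site,
        # another agency or a citizen -- instead of a manual fire drill.
        if self.path == "/data/grants":
            body = json.dumps(GRANTS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404, "No such data set")

if __name__ == "__main__":
    # Serve the feed locally for demonstration purposes.
    HTTPServer(("localhost", 8000), DataHandler).serve_forever()
```

A real deployment would layer access controls and data-quality checks on top, but the consumers would still be reading the same exposed data.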
Thus, we have an excellent solution that can potentially kill
two birds with one stone: the public's need for government
transparency through exposed data and the government's need
for coordinated action through information sharing. I hope the next
administration heeds the professors' sage advice.
Daconta (mdaconta@acceleratedim.com) is chief technology officer at Accelerated Information Management and former metadata program manager at the Homeland Security Department. His latest book is titled "Information as Product: How to Deliver the Right Information to the Right Person at the Right Time."