Memo to health IT planners: no need to reinvent standards
Columnist Mike Daconta helps clarify the President's Council of Advisors on Science and Technology's recommendation for a universal exchange language for health care information and explains why it is more than feasible today.
In a report to the president in December 2010, the President’s Council of Advisors on Science and Technology (PCAST) called for “the nationwide adoption of a universal exchange language for healthcare information and a digital infrastructure for locating patient records while strictly ensuring patient privacy.”
For me, this was déjà vu given that in 2005, I launched the National Information Exchange Model in conjunction with Jim Feagans and Pat McCreary at the Justice Department as a method to integrate homeland security efforts across federal, state, local and tribal governments.
Through the continued, focused efforts of the succeeding NIEM program managers — Kshemendra Paul and Donna Roy — and many hardworking staff members, NIEM today operates at a scale of millions of messages.
In direct contradiction to this success story, a working group of the Health IT Policy and Standards Committees has concluded that the PCAST recommendation is not feasible by 2013 and, even worse, that they “are unaware of any real-world environments (either in healthcare or other sectors) where the combinations of technologies envisioned for the end-state have been placed into operation.”
Are you kidding me? Consider that I have been teaching and implementing Extensible Markup Language standards, and building software to process them, since 1996; that the Extensible Business Reporting Language has been used extensively and successfully by the Securities and Exchange Commission since 2008; that the Federal Aviation Administration successfully uses the Aeronautical Information Exchange Model; that the mortgage industry has many commercial implementations of its XML standards; that the Defense Department has hundreds of widely adopted XML formats; and that Recovery.gov and E-Verify successfully use NIEM — the list of real-world examples just goes on and on and on.
Standards misunderstood
So, although I could decry the lack of leadership, the meekness of the working group's conclusions, industry representatives' circling of the wagons and possibly even throw in a greed-driven conspiracy theory or two — I won't bother throwing out such red meat. Instead, I believe the root problem is a misunderstanding of how standards development has evolved.
Your father’s data standards (pre-XML and pre-Web) were often overly ambitious (“let’s boil the ocean”), took far too long to develop (“if everyone would just do it my way”) and resulted in thick, paper volumes that sat on dusty shelves or became doorstops.
Today, with Asynchronous JavaScript and XML-driven websites and service-oriented architectures thriving, an XML-based standard can move from committee approval to system implementation in weeks.
Recently, I used a software utility to automatically generate all the parsing, marshaling and unmarshaling code for a complex OMB Exhibit 300 XML schema. By hand, such software development would have taken a month of boring, grunt-like coding. The bottom line is that during the past decade, there has been a sea change in the tools, techniques and understanding of how to develop standards for immediate adoption, implementation and use.
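To make the benefit of such code generation concrete, here is a minimal sketch of the kind of bindings those utilities emit — hand-written here in Python for illustration. The `PatientRecord` element and its fields are hypothetical stand-ins for a type defined in a real exchange schema; in practice a generator produces classes like this directly from the schema, so developers never write the parsing and marshaling code by hand.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

# Hypothetical message type standing in for an element defined in an
# XML exchange schema; binding generators emit classes like this one.
@dataclass
class PatientRecord:
    patient_id: str
    provider: str

    def to_xml(self) -> str:
        """Marshal the record into an XML string."""
        root = ET.Element("PatientRecord")
        ET.SubElement(root, "PatientID").text = self.patient_id
        ET.SubElement(root, "Provider").text = self.provider
        return ET.tostring(root, encoding="unicode")

    @classmethod
    def from_xml(cls, text: str) -> "PatientRecord":
        """Unmarshal a record from an XML string."""
        root = ET.fromstring(text)
        return cls(
            patient_id=root.findtext("PatientID"),
            provider=root.findtext("Provider"),
        )

# Round trip: any system that agrees on the schema can exchange the message.
record = PatientRecord(patient_id="P-001", provider="Example Clinic")
xml_text = record.to_xml()
assert PatientRecord.from_xml(xml_text) == record
```

Multiply this small class by the dozens or hundreds of element types in a real schema and the month of grunt coding becomes clear — as does the value of generating it automatically.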
Given that, standardization before mass adoption must be the rule and not the exception. Additionally, as evidenced by the plethora of XML-based cloud application programming interfaces, the cloud is well suited to take advantage of this new, rapid standards environment.
An interesting recent analogy that is testing this principle is North American electric vehicle manufacturers’ agreement on a standard electric plug design.
That effort and rapid consensus follows the same logic that is necessary for electronic health records: standardization before mass adoption is smart economics. If you think about the integration challenge ahead for EHRs to be effective — data flowing across organizations of every shape and size, from small mom-and-pop operations to huge multistate health conglomerates — it seems obvious that interoperability is paramount.
Thus, for EHRs to be interoperable, a universal exchange language must also be standardized before widespread adoption. Simply put, the PCAST recommendations are feasible — today.