Why not use a supercomputer to process health care records?
The U.S. government operates the most powerful supercomputers in the world. Using some of that power to process government health care records would be difficult to set up, but the benefits could be worth it, says William Jackson.
The application of information technology to health care records promises to improve not only medical care but also health care processes, as critical information is digitized for greater portability and easier access. Medical claims processing has been computerized for years, but it remains a fragmented and time-consuming process filled with inefficiencies.
The U.S. government is the nation’s largest health care provider, and its system of paying claims first, evaluating them later and chasing down inappropriate payments months after the fact has resulted in waste estimated at hundreds of billions of dollars a year.
But the government also operates the most powerful supercomputers in the world. Why not use some of that power to process government health care records?
That is the idea of Andrew Loebl, a senior researcher at the Computational Science and Engineering Division at Oak Ridge National Laboratory. The benefits would be threefold: Real-time processing of claims as a whole could help identify fraudulent, wasteful and inappropriate claims before they are paid; it could provide a better understanding of the relationship between treatment and outcome; and because the data would remain in government hands, it could ensure privacy.
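To give a flavor of what pre-payment screening over a single pool of claims might look like, here is a minimal sketch of one simple check: flagging apparent duplicate claims before they are paid. This is an illustration only, not Loebl’s design; the field names (claim_id, patient_id, provider_id, procedure_code, service_date, amount) are hypothetical and do not correspond to any actual claims format.

```python
# Minimal sketch: flag apparent duplicate claims in a pooled data set
# before payment. All field names are hypothetical.
from collections import defaultdict

def flag_duplicates(claims):
    """Group claims by (patient, provider, procedure, date) and flag
    every group that contains more than one claim for human review."""
    groups = defaultdict(list)
    for claim in claims:
        key = (claim["patient_id"], claim["provider_id"],
               claim["procedure_code"], claim["service_date"])
        groups[key].append(claim)
    return [c for group in groups.values() if len(group) > 1 for c in group]

claims = [
    {"claim_id": 1, "patient_id": "P1", "provider_id": "D9",
     "procedure_code": "99213", "service_date": "2010-03-01", "amount": 120.0},
    {"claim_id": 2, "patient_id": "P1", "provider_id": "D9",
     "procedure_code": "99213", "service_date": "2010-03-01", "amount": 120.0},
]
print(flag_duplicates(claims))  # both claims flagged before either is paid
```

The point of the sketch is the pooling, not the rule itself: a check like this only works when claims from every program land in the same place before money goes out the door.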
The technology is not a problem, Loebl said. Oak Ridge’s Jaguar supercomputer, ranked the fastest in the world at 2.3 petaflops (a petaflop is a thousand trillion floating-point operations per second), has more than enough memory and processing power to handle a year’s worth of claims in minutes without interfering with ongoing research. And the software to process and analyze claims already exists.
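A rough back-of-the-envelope estimate shows why a machine of Jaguar’s scale could, in principle, get through a year of claims that quickly. The claim volume, the per-claim cost of a rules-based screen and the efficiency figure below are illustrative assumptions, not published statistics; only the 2.3-petaflop rating comes from the article.

```python
# Back-of-the-envelope estimate (illustrative assumptions only):
# how long might a rules-based screen of a year's federal claims take?
claims_per_year = 5e9   # assumed: billions of claims across all programs
ops_per_claim = 1e6     # assumed: generous budget for a rules-based screen
peak_flops = 2.3e15     # Jaguar's peak rating cited above
efficiency = 0.01       # assume only 1 percent of peak is achieved in practice

seconds = (claims_per_year * ops_per_claim) / (peak_flops * efficiency)
print(f"~{seconds / 60:.1f} minutes")  # on the order of a few minutes
```

Even with those deliberately pessimistic assumptions, the arithmetic lands in the range of minutes, which is consistent with Loebl’s assessment that raw computing power is not the obstacle.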
“There’s nothing rocket science about this,” Loebl said. “None of this will take any sophisticated software.”
The hard part would be getting the myriad government health programs, from Medicare and Medicaid to those run by the Defense and Veterans Affairs departments, to combine their data into a single pool for processing and analysis.
“The idea is unbelievable to the decision-makers,” Loebl said. “We’re a long way from persuading people that this is practical.”
Such a program would not completely reform government health care delivery and payment systems, of course. The folks at Oak Ridge might know nuclear fission and supercomputers, but they are not physicians, and they would not be handling the actual claims payments and enforcement. There still would be plenty of work for administrators and contractors to do. But the economies of scale possible from applying supercomputing power to heretofore distributed business processes could help to stem waste, fraud and abuse in the health care system while protecting the integrity of sensitive information.
Combining data from a variety of programs into a single coherent stream for processing would not be a trivial task. It would involve a level of cooperation not often seen among agencies, in addition to substantial changes in business policies and processes. It is likely that legislation would be necessary to allow or require this. But the potential benefits could well be worth the effort.
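To make the consolidation problem a bit more concrete, here is a minimal sketch of the kind of normalization step it would require: records arriving in different program-specific formats mapped into one common structure before pooled analysis. The program formats and field names here are hypothetical, invented purely for illustration.

```python
# Minimal sketch: map claims from different (hypothetical) program formats
# into one common record so they can be analyzed as a single pool.
from dataclasses import dataclass

@dataclass
class UnifiedClaim:
    program: str          # e.g. "Medicare" or "VA"
    patient_id: str
    procedure_code: str
    amount: float

def from_medicare(rec: dict) -> UnifiedClaim:
    # hypothetical Medicare-style field names
    return UnifiedClaim("Medicare", rec["beneficiary_id"], rec["hcpcs"], rec["paid_amt"])

def from_va(rec: dict) -> UnifiedClaim:
    # hypothetical VA-style field names
    return UnifiedClaim("VA", rec["veteran_id"], rec["proc"], rec["cost"])

pool = [
    from_medicare({"beneficiary_id": "B100", "hcpcs": "99213", "paid_amt": 120.0}),
    from_va({"veteran_id": "V200", "proc": "99213", "cost": 95.0}),
]
print(pool)  # one coherent stream, ready for pooled screening and analysis
```

The code is trivial; the hard part, as noted above, is the institutional work of agreeing on that common structure and getting every program to feed it.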
Administrators and legislators should take a careful look at Loebl’s idea, determine its feasibility, evaluate costs and benefits, and decide whether it is worth moving forward with. After all, as Loebl said, it’s not rocket science.