CDC moves disease surveillance system to the cloud
CDC's BioSense, originally developed as an anti-terrorism program, is remaking itself to provide a cloud-based platform for federal, state and local health care coordination.
The BioSense program, created by the Centers for Disease Control and Prevention in the wake of the 2001 terrorist attacks, is remaking itself to provide a cloud-based collaboration platform for federal, state and local health officials.
“It essentially is for the rapid, automated collection and dissemination of data,” said Dr. Taha Kass-Hout, BioSense program manager at CDC’s Public Health Surveillance Program Office. “The original mandate was to look at bioterrorism incidents,” but that is being expanded to include all human health-related conditions.
Making a broader range of information available will help public health officials spot, predict, prepare for and respond to outbreaks and events. In the course of this change, CDC has decided that maintaining a centralized infrastructure for gathering and using data no longer makes sense.
Related coverage:
Options expand for online authentication
ID management’s weakness: ‘There is no demand’
That means money can be shifted from IT acquisition and maintenance to lowering the cost of participation for state and local offices. However, the cloud also comes with its own set of security concerns as resources, including hardware, software and data, are moved to a third-party environment.
There is a saying, “you can outsource security, but you can’t outsource responsibility,” said Jon Geater, director of technical strategy at Thales e-Security, an electronics and systems security firm.
Securing data, managing identities and controlling access are different in the cloud but not necessarily more difficult. In theory, the economy of scale and centralized management in a cloud environment could make best security practices such as real-time monitoring, patch management and configuration management easier — if the service provider is doing these things, that is. Not all of them are.
“The cloud is a marketplace,” Geater said. “Some people will be better than others.”
BioSense security
Security and access control are among the criteria that will be considered when the BioSense community chooses a cloud provider later this year. Both the customer and the provider will have to work together to ensure that appropriate levels of security are maintained to safeguard sensitive public health data.
“There is a responsibility for the organization to maintain a level of security” for the data it owns, Geater said. But the service provider also must provide appropriate information and tools to the customer. “It’s a shared thing.”
BioSense was authorized in 2002 and established in 2003 to provide early detection and rapid assessment of illnesses and outbreaks that could indicate a terrorist attack. That mission is not being abandoned, but the information now also is being used to improve public health situational awareness: detecting threats earlier, supporting day-to-day public health practice, and letting public health officials in different parts of the country collaborate when they face similar situations or hold information that others need.
Recent uses of the evolving BioSense program include monitoring health problems related to the Gulf oil spill in Alabama, Florida, Louisiana, Mississippi and Texas and tracking possible rabies infections and flu-like illnesses.
BioSense data currently is being collected from local hospitals, health departments, laboratories and the pharmaceutical industry, as well as from federal health care providers, including the Defense and Veterans Affairs departments.
“The data comes in various forms, but primarily in some electronic form,” Kass-Hout said. DOD, VA and some labs have automated systems that pass the information to CDC.
Socialized health data
In the past, CDC has gathered data centrally and produced reports and other products for BioSense participants. The new model will provide a user-centric environment rich with social features for sharing information. Participants will have more control over and access to their data and will be able to form ad hoc teams to collaborate in developing products needed at the time. The collection system also will be standardized so that data will come to CDC from public health departments rather than directly from local hospitals.
“In the new model, we are going to embrace the health department,” Kass-Hout said. “They own that relationship with the hospital,” and it is important that the local departments see the data as it is being passed on. The platform also will provide tools to allow state and local users to slice and dice the data as they need to.
The redesign process began about two years ago, the cloud approach was selected about a year ago and discussions now are under way with service providers. Requirements for the provider will include compliance with basic Federal Information Security Management Act requirements, additional security as required to adequately protect data, and a large enough customer base to ensure the service provider will be around for a while.
RTI International, a research institute based in Research Triangle Park, N.C., has been brought in to help with the redesign, while the Association of State and Territorial Health Officials is the lead representative for state and local offices. Design prototypes for the system were developed in collaboration with stakeholders, working from 22 guiding principles, and still are evolving. The idea is to allow collaboration among peers, with the ability to download and save reports in a variety of formats.
Kass-Hout said the new platform is expected to be up and running by November.
Trust chain stretched
Although the technology for managing identity and controlling access does not change significantly in a cloud environment, the chain of trust becomes more stretched out, Geater said. Without infrastructure being under the control of a single IT team, “things that were implicitly trusted now have to be explicitly trusted.”
The business model for security and access control also changes. Because the customer suffers the primary loss in the event of a breach, security becomes a secondary business consideration for the provider. “The cost/benefit calculation is different, so the investment decision is very different,” Geater said.
The result is caveat emptor: It is up to the customer to ensure that the service provider has adequate controls in place and can demonstrate them to an auditor.
Small organizations that cannot afford a dedicated IT team often will be better off in a cloud environment, because a capable provider will offer better security and identity management than the small organization can manage on its own, Geater said. But a larger organization with stricter requirements will have to look much more carefully to find a provider that meets its expectations.
Tools used for authentication and access control do not change in the cloud. Choice will depend on the level of assurance required, which in turn depends on the sensitivity of the information being protected and the likely impact of a breach. But although the default user ID and password combination will work in the cloud, Geater is not a fan.
“Passwords are easily guessed or stolen and they are hard to revoke,” he said. And if passwords are being sent via the Internet to a cloud, “they are going to be stolen. No question.”
PKI fan base
A flexible — but complex — alternative to passwords is public-key infrastructure, which uses cryptographic keys for authentication and protection of data. “Here at Adobe, we have been big fans of PKI since the late '90s, when we first began incorporating it into our documents,” said John Landwehr, senior director of enterprise security solutions at Adobe Systems.
When embedded in documents, PKI can be used to digitally sign and verify a document, as well as authenticate users through a variety of ID management sources, including Active Directory and LDAP.
Although PKI has been around for quite a while, its drawback is complexity. It is a scheme that uses mathematically related cryptographic key pairs, one public and one private, to encrypt and decrypt data for a wide variety of uses.
The chore of managing these keys has also earned it the nickname “painful key infrastructure,” Landwehr said. “But there have been advances to ease the pain both of deployment and use.”
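To make the key-pair idea concrete, here is a minimal sketch of signing and verifying a document with a public/private key pair, using the open-source Python cryptography package. The package, key size and document contents are illustrative choices, not details of Adobe's or CDC's systems.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Generate a mathematically related key pair: the private key stays with the
# signer, while the public key can be distributed (typically inside a certificate).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"Sample report contents"  # stand-in for a signed document

# Sign the document with the private key.
signature = private_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# Verify with the public key; any change to the document or the signature
# raises InvalidSignature, which is how a reader detects tampering.
try:
    public_key.verify(
        signature,
        document,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("Signature valid: document unchanged")
except InvalidSignature:
    print("Signature invalid: document or signature was altered")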
Much of the advance has come in government, in the form of the military’s Common Access Card (CAC) and its civilian counterpart, the Personal Identity Verification (PIV) card, both of which incorporate keys for PKI. Tying the keys to ID cards as well as the introduction of an infrastructure that includes card readers and authorities for verifying keys are helping to make PKI a more versatile tool.
One of the advantages of PKI is that it can be used not only to verify the identity of someone accessing online resources but also to authenticate the resources being accessed. This is an important consideration when using the Web to publish official public documents, as the Government Printing Office does. GPO uses digital signatures to verify that the information in public documents being viewed, such as legislation and public laws, has not been changed, a step a Web browser can handle without any effort on the part of the viewer.
PKI for BioSense?
PKI can take the process one step further in a system such as BioSense, which includes sensitive data and goes beyond one-way publishing. PKI can verify that users are accessing genuine data repositories and that the data has not been modified, and it also can be used to control both access to that data and the contribution of new data.
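As a rough illustration of the repository-verification piece, the following sketch uses Python's standard library to check that a server's certificate chains to a trusted authority before any data is exchanged. The host name is hypothetical and stands in for whatever endpoint a BioSense-style system might expose.

import socket
import ssl

# Hypothetical repository endpoint -- not a real BioSense address.
REPOSITORY_HOST = "data.example.gov"

# The default context loads the system's trusted root certificates and
# requires both certificate validation and host-name matching.
context = ssl.create_default_context()

with socket.create_connection((REPOSITORY_HOST, 443)) as sock:
    # The TLS handshake fails here if the server's certificate is not signed
    # by a trusted authority or does not match the expected host name.
    with context.wrap_socket(sock, server_hostname=REPOSITORY_HOST) as tls:
        cert = tls.getpeercert()
        print("Verified certificate subject:", cert["subject"])
        print("Valid until:", cert["notAfter"])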
But despite advances in PKI, it remains largely a stovepipe environment with single-purpose certificates being used in closed populations, such as government employees.
The ubiquity of CAC and PIV within government offers an opportunity to expand the use of PKI to new applications. “The next challenge we’re going to watch is how to roll it out to more of the public,” Landwehr said.
The National Strategy for Trusted Identities in Cyberspace, a federal initiative to establish an “identity ecosystem” with user-friendly credentials that can be widely accepted online, could be a step in this direction.
But to make credentials simple and useful, digital certificates issued by one provider would have to be accepted for a variety of uses, and that will require a federation infrastructure that allows verification across a variety of applications. Government has been a leader in this area as well with its Federal PKI Bridge, and the creation of standards for PIV Interoperability, which would allow publicly issued digital certificates to be accepted by government, is another step in that direction.
But the infrastructure is far from fully deployed, and challenges remain.
“When properly implemented, we’ve seen technology like PKI bring not only cost savings but improve the privacy of the participants and the security of the transaction,” Landwehr said. But as for the job of building out an infrastructure and creating a model for sharing the costs among users, “that’s a tough one.”