Moving storage to the cloud? Don't forget about security.
Agencies should look for storage providers that support their security and privacy requirements as they move data into cloud infrastructures, an industry expert with Amazon Web Services advises.
The duplication of data across multi-tenant environments will be a growing concern for agencies moving big storage to the cloud.
As a result, agencies should choose a data “replication methodology and cloud storage system that supports their security and privacy requirements, including encryption of data in flight and at rest,” said Mark Ryland, a solutions architect with Amazon Web Services.
Replication technology duplicates stored or archived data in real time over a storage-area network.
More than a hundred government agencies are using AWS, and a large number of them use it for disaster recovery, Ryland said. Disaster recovery users replicate their mission-critical data to AWS on a regularly scheduled basis and configure their systems to run application workloads from that data on AWS if their on-premises systems fail.
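Ryland did not describe a particular tool for that scheduled replication. As one illustration, the Python sketch below uses the boto3 SDK to copy a local backup directory into an S3 bucket; the bucket name and directory path are hypothetical, and an agency could run something like it from a nightly scheduled job.

```python
# Minimal sketch: copy a local backup directory to an S3 bucket as part of a
# scheduled disaster-recovery job. Bucket and path names are hypothetical.
import os
import boto3

s3 = boto3.client("s3")
BUCKET = "agency-dr-backups"          # hypothetical destination bucket
SOURCE_DIR = "/var/backups/nightly"   # hypothetical local backup directory

for root, _dirs, files in os.walk(SOURCE_DIR):
    for name in files:
        local_path = os.path.join(root, name)
        # Preserve the directory layout as the S3 object key.
        key = os.path.relpath(local_path, SOURCE_DIR)
        s3.upload_file(local_path, BUCKET, key)
        print(f"replicated {local_path} -> s3://{BUCKET}/{key}")
```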
Amazon Simple Storage Service (Amazon S3) provides multiple options for encryption of data at rest. System administrators who prefer to manage their own encryption keys can use a client-side encryption library such as the Amazon S3 Encryption Client to encrypt data before uploading it to Amazon S3. Alternatively, they can use Amazon S3 Server Side Encryption if they want Amazon S3 to manage the keys for them, Ryland said.
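The two options might look like the following Python sketch, which uses the boto3 SDK. The Amazon S3 Encryption Client Ryland mentions is a library shipped with certain AWS SDKs; here a generic encryption library (the cryptography package's Fernet) stands in for it purely as an illustration, and the bucket and file names are hypothetical.

```python
# Sketch of the two encryption-at-rest options: server-side encryption managed
# by S3, and client-side encryption where the agency holds the key.
import boto3
from cryptography.fernet import Fernet  # stand-in for a client-side encryption library

s3 = boto3.client("s3")
BUCKET = "agency-archive"                       # hypothetical bucket

with open("2012-q3.csv", "rb") as f:            # hypothetical local file
    data = f.read()

# Option 1: server-side encryption -- Amazon S3 manages the keys.
s3.put_object(
    Bucket=BUCKET,
    Key="reports/2012-q3.csv",
    Body=data,
    ServerSideEncryption="AES256",
)

# Option 2: client-side encryption -- the agency manages its own key and
# encrypts the data before it ever leaves the premises.
key = Fernet.generate_key()                     # in practice, load from a key store
ciphertext = Fernet(key).encrypt(data)
s3.put_object(Bucket=BUCKET, Key="reports/2012-q3.csv.enc", Body=ciphertext)
```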
Amazon S3 lets users set tight security policies that give them control over their resources, and it provides data protection features, he noted.
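For example, a bucket policy can refuse any request that does not arrive over an encrypted connection, which supports the "encryption in flight" requirement Ryland describes. A minimal sketch using boto3, with a hypothetical bucket name:

```python
# Sketch of a restrictive bucket policy applied through the S3 API.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "agency-archive"   # hypothetical bucket

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Deny any request that does not use TLS, helping enforce
            # encryption of data in flight.
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```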
Additionally, a growing number of organizations are replicating their scientific data to the cloud. Sharing large datasets has long been a major problem for scientific researchers, but now data can be replicated to a single cloud storage system such as AWS, Ryland said. Other research teams can then run analysis jobs directly against that data or quickly move it into Amazon Elastic Compute Cloud (EC2) for additional processing.
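A team granted read access to such a shared bucket could pull a dataset down to an EC2 instance for processing with a few API calls. A sketch with hypothetical bucket and prefix names:

```python
# Sketch: list and download a shared dataset from S3 for local analysis.
import boto3

s3 = boto3.client("s3")
BUCKET = "shared-research-data"   # hypothetical shared bucket
PREFIX = "experiment-42/"         # hypothetical dataset prefix

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        filename = obj["Key"].split("/")[-1]
        s3.download_file(BUCKET, obj["Key"], filename)
        print(f"downloaded {obj['Key']} ({obj['Size']} bytes)")
```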
“Another thing to consider regarding replicating data is optimizing for high performance,” Ryland said. For example, Amazon is working with a large U.S. government agency that is moving its website from a private data center in the Midwest to AWS. The agency wants to take advantage of AWS’s massive scale and high availability to quickly replicate data between AWS regions on the East and West coasts, Ryland said.
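Ryland did not spell out the mechanism. One way to illustrate the idea is Amazon S3's cross-region replication feature, configured here through boto3; the bucket names and IAM role ARN are hypothetical, both buckets must already exist, and versioning is required on each (shown for the source bucket).

```python
# Minimal sketch: replicate objects from an East Coast bucket to a West Coast
# bucket using S3 cross-region replication.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Versioning must be enabled on the source (and destination) bucket.
s3.put_bucket_versioning(
    Bucket="agency-site-east",
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_bucket_replication(
    Bucket="agency-site-east",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::123456789012:role/s3-replication-role",  # hypothetical
        "Rules": [
            {
                "ID": "replicate-everything",
                "Prefix": "",          # replicate all objects
                "Status": "Enabled",
                "Destination": {"Bucket": "arn:aws:s3:::agency-site-west"},
            }
        ],
    },
)
```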
Agencies can also replicate their data in parallel at high speed. Amazon Elastic MapReduce lets users create compute clusters that transfer data into AWS in parallel, so they can spread the workload across as many machines as they need for extremely fast data replication, Ryland said.
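Ryland did not name a specific tool for the parallel transfer. The sketch below uses S3DistCp, a utility commonly run as a step on Amazon Elastic MapReduce clusters for large parallel copies, submitted through boto3; the cluster ID, source path and bucket name are hypothetical.

```python
# Sketch: add a parallel-copy step to a running EMR cluster. S3DistCp spreads
# the copy across the cluster's nodes.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLECLUSTER",   # hypothetical cluster ID
    Steps=[
        {
            "Name": "parallel-replication",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "s3-dist-cp",
                    "--src", "hdfs:///staging/ingest",        # hypothetical source
                    "--dest", "s3://agency-archive/ingest/",  # hypothetical destination
                ],
            },
        }
    ],
)
```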