Privacy-enhancing technology for data analysis
A new guide aims to help privacy officers get a better understanding of privacy-enhancing technology as it applies to data science research and digital government strategies.
As more government agencies appoint chief privacy officers, the importance of privacy-enhancing technologies (PETs) is also coming to the fore. These technologies allow agencies to take advantage of the increasing amount of data available for analysis while ensuring personal or sensitive information stays private.
PETs are used for many reasons: they help provide secure access to private datasets, enable joint analysis of data by several organizations and allow secure outsourcing of data to the cloud for computation.
To help privacy officers get a better understanding of PETs and to help inform privacy protection policies related to data science research and digital government strategies, the Royal Society has released a new report that provides an overview of five current and promising PETs, including their respective readiness levels and case studies.
"Protecting privacy in practice: The current use, development and limits of Privacy Enhancing Technologies in data analysis" focuses on five basic PETs the Royal Society identified as being particularly promising for privacy-aware data collection, analysis and dissemination:
Homomorphic encryption schemes make it possible to run computations on encrypted data without decrypting it. A user can encrypt data, send it to the cloud for processing and get the encrypted results of the computation back, which can then be decrypted locally (a toy sketch of this property follows the list).
Trusted execution environments provide a hardware-based solution that isolates data and code so that they cannot be read by the operating system or hypervisor involved in running the computations.
Secure multiparty computation enables several parties to run joint computations on their combined data without any party sharing its raw data with the others (see the secret-sharing sketch after this list).
Differential privacy, which addresses privacy in disclosure rather than in computation, adds carefully calibrated noise so that the release of a statistic does not reveal information about any individual (see the Laplace noise sketch after this list).
Personal data stores refer to consumer apps and services that are supported by different kinds of PETs and enable people to have more control over their data.
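To make the homomorphic property concrete, the toy Python sketch below (not from the report) uses unpadded "textbook" RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts produces a ciphertext of the product of the plaintexts. Real deployments use far richer schemes and key sizes; the tiny primes here offer no security.

```python
# Toy illustration only: unpadded "textbook" RSA is multiplicatively
# homomorphic, so multiplying ciphertexts multiplies the hidden plaintexts.

def rsa_keygen():
    p, q = 61, 53                 # small demo primes
    n = p * q                     # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                        # public exponent
    d = pow(e, -1, phi)           # private exponent (Python 3.8+)
    return (n, e), (n, d)

def encrypt(pub, m):
    n, e = pub
    return pow(m, e, n)

def decrypt(priv, c):
    n, d = priv
    return pow(c, d, n)

pub, priv = rsa_keygen()
a, b = 7, 6
ca, cb = encrypt(pub, a), encrypt(pub, b)

# A server holding only ciphertexts can multiply them...
c_product = ca * cb % pub[0]

# ...and only the key holder can recover the product of the plaintexts.
assert decrypt(priv, c_product) == a * b    # 42
```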
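Secure multiparty computation is often built from secret sharing. The hypothetical sketch below shows how three organizations could learn the total of their private counts without revealing any individual number; the agency names and values are invented for illustration.

```python
# Hypothetical sketch of additive secret sharing, a building block of secure
# multiparty computation: three agencies learn only the total of their
# private counts, never each other's individual numbers.
import random

PRIME = 2**61 - 1   # all arithmetic is done modulo a large prime

def share(secret, n_parties):
    """Split a secret into n_parties random shares that sum to it mod PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Each party's private input (made-up caseload counts for illustration)
inputs = {"agency_a": 120, "agency_b": 340, "agency_c": 95}

# 1. Every agency splits its count and hands one share to each party.
all_shares = [share(v, 3) for v in inputs.values()]

# 2. Each party sums the shares it received; each partial sum looks random.
partial_sums = [sum(column) % PRIME for column in zip(*all_shares)]

# 3. Combining only the partial sums reveals the total, not the inputs.
total = sum(partial_sums) % PRIME
assert total == sum(inputs.values())   # 555
```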
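Differential privacy is commonly implemented with the Laplace mechanism, which adds noise scaled to a query's sensitivity and a privacy budget, epsilon. The minimal sketch below releases a noisy count; the count and epsilon are illustrative values, not figures from the report.

```python
# Minimal sketch of the Laplace mechanism: noise calibrated to a query's
# sensitivity and a privacy budget (epsilon) is added before a count is
# released.
import random

def laplace_noise(scale):
    # The difference of two exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count, epsilon):
    # A counting query changes by at most 1 when any one person is added or
    # removed, so the noise scale is sensitivity / epsilon = 1 / epsilon.
    return true_count + laplace_noise(1.0 / epsilon)

# Publish how many records in a dataset match some attribute, privately.
print(private_count(true_count=1284, epsilon=0.5))   # a noisy value near 1284
```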
The report also includes a summary chart that shows the types of risk addressed, the kind of data protected, benefits, limitations and readiness level -- research, pilot or product -- for each type of PET so readers can compare the approaches. Each PET is illustrated by a case study.
One of the case studies cited is the Census Bureau's use of differential privacy and its plan to deploy it in its statistical analyses of the 2020 population count.
PETs are gaining traction elsewhere in government.
A partnership among the Allegheny County, Pa., Department of Human Services, the Bipartisan Policy Center and Galois, a tech research and development company, is using secure multiparty computation to analyze data on social services clients held by different agencies while protecting individual privacy. That project also deployed a trusted execution environment to protect the data.
The Intelligence Advanced Research Projects Activity's Homomorphic Encryption Computing Techniques with Overhead Reduction (HECTOR) program, launched in 2017, aims to develop a “comprehensive set” of cryptographic tools, programming languages and design and verification tools that architects and programmers can use to take advantage of homomorphic encryption. That kind of challenge-based funding approach seems particularly suited to bridging gaps between theory and practice.
In Travis County, Texas, officials are using STAR-Vote, which stands for Secure, Transparent, Auditable, and Reliable, to secure election results. It uses homomorphic encryption to encode each vote, "then publishes an online list of encrypted results by voter in a form that allows anyone from an election-monitoring organization to individual voters themselves to check the results," according to a report in Wired.
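The mechanics of homomorphic tallying can be sketched with the Paillier cryptosystem, an additively homomorphic scheme that stands in here for whatever cryptography STAR-Vote itself uses: each ballot is encrypted as a 0 or a 1, anyone can multiply the published ciphertexts to get an encryption of the total, and only the election authority's private key opens the final sum. The primes and votes below are toy values.

```python
# Toy sketch of homomorphic vote tallying with the Paillier cryptosystem.
# Multiplying Paillier ciphertexts adds the underlying plaintexts, so the
# encrypted ballots can be combined publicly and only the authority's key
# reveals the final count.
import math
import random

def paillier_keygen():
    p, q = 10007, 10009                      # toy primes, illustration only
    n = p * q
    n2 = n * n
    g = n + 1                                # standard simple choice of g
    lam = math.lcm(p - 1, q - 1)
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (n, lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:               # randomness must be invertible mod n
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(priv, c):
    n, lam, mu = priv
    n2 = n * n
    return (pow(c, lam, n2) - 1) // n * mu % n

pub, priv = paillier_keygen()

votes = [1, 0, 1, 1, 0, 1]                   # yes = 1, no = 0
ballots = [encrypt(pub, v) for v in votes]   # the published encrypted ballots

# Anyone can combine the public ciphertexts into an encryption of the tally.
tally_ct = 1
for c in ballots:
    tally_ct = tally_ct * c % (pub[0] ** 2)

assert decrypt(priv, tally_ct) == sum(votes)  # 4 "yes" votes
```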
Read the full Royal Society report here.