Leveraging Differential Privacy While Attending to Social and Political Commitments
- Kohli, Nitin
- Advisor(s): Mulligan, Deirdre
Abstract
Public and private organizations alike compute statistics from personal data. These statistics serve a wide array of purposes, such as informing federal, state, and local policymaking, distributing humanitarian aid during crises, and keeping the public informed about social, economic, and health trends in the population. However, statistical uses of personal data can harm individual privacy. Modern privacy research has demonstrated that personal data can be reconstructed from statistics, and that statistical summaries can be analyzed to identify the presence of specific individuals in the underlying data. Attenuating these informational privacy concerns requires robust privacy protections.
Differential privacy is a mathematical definition of privacy that enables statistical learning while protecting the privacy of individuals in the underlying data. In real-world applications, however, other conceptions of privacy, and values beyond privacy, are also at stake.
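For orientation, the standard formal statement of that definition, due to Dwork, McSherry, Nissim, and Smith (2006), is reproduced below; it is included here as background rather than quoted from the dissertation:

```latex
% Standard definition of epsilon-differential privacy (Dwork et al., 2006);
% background reference, not text from the dissertation itself.
A randomized mechanism $M$ is $\varepsilon$-differentially private if, for all
pairs of datasets $D, D'$ differing in the record of a single individual, and
for all measurable sets $S \subseteq \mathrm{Range}(M)$,
\[
  \Pr[M(D) \in S] \;\le\; e^{\varepsilon} \, \Pr[M(D') \in S].
\]
```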
In this dissertation, we examine how differentially private algorithms can be used to compute statistics while attending to context-specific social and political commitments. This inquiry is informed by two studies, each of which examines the question in a distinct way. In the first study, we consider ways to configure differentially private algorithms based on the privacy and accuracy concerns of data subjects and other stakeholders, all the while attending to the values of privacy, fairness, and strategyproofness. The second study considers the ways in which differentially private algorithms can be configured to attend to context-specific values in public health. We do so through a case study set against the backdrop of the COVID-19 pandemic, in which mobility statistics derived from mobile phone metadata inform decisions on lockdown mandates and other non-pharmaceutical interventions.
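To make "configuring a differentially private algorithm to privacy and accuracy concerns" concrete, here is a minimal sketch of one common pattern: calibrating the Laplace mechanism's epsilon to a stakeholder accuracy target. This sketch is illustrative only, assumes a simple counting query, and is not the dissertation's method; the function names `laplace_count` and `epsilon_for_accuracy` are hypothetical.

```python
import numpy as np

def laplace_count(true_count: int, epsilon: float,
                  rng=np.random.default_rng()) -> float:
    """Release a count under epsilon-DP via the Laplace mechanism.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

def epsilon_for_accuracy(alpha: float, beta: float) -> float:
    """Smallest epsilon so the noise exceeds alpha w.p. at most beta.

    For Laplace(1/eps) noise, P(|noise| > alpha) = exp(-eps * alpha),
    so eps = ln(1/beta) / alpha meets the (alpha, beta) target.
    """
    return np.log(1.0 / beta) / alpha

# Example: stakeholders tolerate error above 10 with probability at most 5%.
eps = epsilon_for_accuracy(alpha=10.0, beta=0.05)   # about 0.30
noisy = laplace_count(true_count=1234, epsilon=eps)
```

The design point the sketch illustrates is the direction of the negotiation: stakeholders articulate an accuracy requirement, and the privacy parameter epsilon is derived from it, rather than epsilon being fixed by fiat.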
Throughout these studies, we develop theory and tools to effectively leverage differential privacy in practice. Beyond the mathematical results in this dissertation, we provide strategies and guidelines that aid the development and deployment of differentially private algorithms. This work serves as a guide for researchers, analysts, developers, and policymakers on how to use mathematical approaches to attend to context-specific social and political commitments, including privacy, in areas where statistical knowledge is essential to advancing the public good.