Data Science Seminar
Hosted by the Department of Mathematical Sciences
In this presentation, we consider a new mechanism for achieving differential privacy called the K-Norm Gradient Mechanism (KNG). Two of the most popular existing methods, the exponential mechanism and objective perturbation, have had great success, but each has drawbacks that range from minor to severe depending on the setting. Recently, it was shown that the exponential mechanism is not asymptotically efficient: it introduces too much noise and thus reduces statistical utility quite broadly. Conversely, objective perturbation enjoys excellent utility but can be difficult to generalize and requires strong structural assumptions. We show how our new approach, KNG, assuages nearly all of these issues; it is nearly as easy to implement as the exponential mechanism but has much better asymptotic properties. We highlight how KNG agrees with well-known mechanisms in simpler settings, while using its framework to develop new privacy tools in more complicated settings such as linear and quantile regression.
Biography of the speaker: Dr. Reimherr is an Associate Professor of Statistics at Penn State University. He received his PhD in Statistics from the University of Chicago in 2013. He has an extensive research background in functional data analysis, for which he was awarded the Noether Young Scholar Award from the ASA. A great deal of his recent work focuses on data privacy, with a particular emphasis on nonparametric statistics and functional data analysis.