Overview
Explore a Google TechTalk on a fast algorithm for adaptive private mean estimation, presented by John Duchi as part of the Privacy ML series. Delve into the design of an (ε, δ)-differentially private algorithm for estimating the mean of a d-variate distribution with unknown covariance Σ. Learn how this algorithm achieves optimal convergence rates with respect to the induced Mahalanobis norm, runs in Õ(nd²) time, and offers near-linear sample complexity for sub-Gaussian distributions. Discover its ability to handle degenerate or low-rank Σ and to extend adaptively beyond sub-Gaussianity. Understand the significance of this work in overcoming previous limitations of exponential computation time or superlinear scaling. Examine topics such as the Laplace mechanism, truncation, CoinPress, the proposed test-release framework, and stable mean estimation. Access the related paper on arXiv for further insights into this approach to private mean estimation.
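Before turning to the adaptive, covariance-aware estimator, the talk reviews standard building blocks such as the Laplace mechanism and truncation for one-dimensional mean estimation. The sketch below illustrates that background only, not the talk's algorithm: a minimal ε-DP mean estimate obtained by clipping to assumed a-priori bounds and adding Laplace noise calibrated to the clipped mean's sensitivity. The function name, clipping bounds, and NumPy usage are illustrative choices, not taken from the talk or paper.

```python
import numpy as np

def private_mean_laplace(samples, lower, upper, epsilon, rng=None):
    """Basic epsilon-DP mean estimate: clip, average, add Laplace noise.

    Textbook background only; not the adaptive algorithm from the talk.
    `lower` and `upper` are assumed a-priori bounds on the data.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(samples, dtype=float), lower, upper)
    n = x.size
    # Changing one record moves the clipped sample mean by at most this much.
    sensitivity = (upper - lower) / n
    return x.mean() + rng.laplace(scale=sensitivity / epsilon)

# Example: estimate the mean of 10,000 draws with epsilon = 1.
if __name__ == "__main__":
    data = np.random.default_rng(0).normal(loc=2.0, scale=1.0, size=10_000)
    print(private_mean_laplace(data, lower=-5.0, upper=5.0, epsilon=1.0))
```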
Syllabus
Introduction
The Problem
Laplace Mechanism
One-dimensional Mean
Truncation
Naive Case
Adaptive Mean Estimation
CoinPress
Proposed Test Release Framework
Two Phase Approach
Sample Covariance
Stable Mean Estimation
Building Blocks
Taught by
Google TechTalks