Differentially Private Learning Needs Hidden State or Much Faster Convergence
Harvard CMSA via YouTube
Overview
Watch a research presentation from the 2022 Symposium on Foundations of Responsible Computing in which Jiayuan Ye from the National University of Singapore explores how the differential privacy analysis of randomized learning algorithms can be tightened through hidden-state analysis, which accounts for releasing only the final model rather than every intermediate iterate. Learn how hidden-state analysis extends to various stochastic minibatch gradient descent schemes, including "shuffle and partition" and "sample without replacement" batching. Because traditional composition bounds charge a privacy cost for every iteration as if all intermediate models were published, hidden-state analysis yields significantly smaller privacy bounds when training for many iterations on high-dimensional data. Examine experimental results from training classification models on the MNIST, FMNIST, and CIFAR-10 datasets that demonstrate improved accuracy under a fixed privacy budget when the hidden-state analysis is used.
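To make the two batching schemes concrete, here is a minimal NumPy sketch of noisy minibatch gradient descent on a logistic-regression loss, switching between "shuffle and partition" and "sample without replacement" batch selection. This is an illustrative assumption, not code from the talk: the function names (noisy_minibatch_gd, clip_grad), the loss, and the noise calibration are hypothetical, and only the final parameters are returned, mirroring the hidden-state setting in which intermediate iterates are never released.

```python
import numpy as np

def clip_grad(g, c):
    """Clip a per-example gradient to L2 norm at most c."""
    norm = np.linalg.norm(g)
    return g * min(1.0, c / norm) if norm > 0 else g

def noisy_minibatch_gd(X, y, scheme="shuffle_and_partition",
                       epochs=5, batch_size=64, lr=0.1,
                       clip_norm=1.0, noise_std=1.0, seed=0):
    """Noisy minibatch gradient descent (hypothetical sketch).

    scheme selects how batches are formed:
      - "shuffle_and_partition": shuffle once per epoch, then walk
        through disjoint consecutive batches.
      - "sample_wo_replacement": each step draws a fresh batch
        uniformly at random without replacement.
    Labels y are assumed to be in {-1, +1}. Only the final w is
    returned; intermediate iterates stay hidden.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    steps_per_epoch = n // batch_size
    for _ in range(epochs):
        if scheme == "shuffle_and_partition":
            perm = rng.permutation(n)
        for t in range(steps_per_epoch):
            if scheme == "shuffle_and_partition":
                idx = perm[t * batch_size:(t + 1) * batch_size]
            else:
                idx = rng.choice(n, size=batch_size, replace=False)
            # Average of per-example clipped logistic-loss gradients.
            grad = np.zeros(d)
            for i in idx:
                margin = y[i] * (X[i] @ w)
                g_i = -y[i] * X[i] / (1.0 + np.exp(margin))
                grad += clip_grad(g_i, clip_norm)
            grad /= batch_size
            # Gaussian noise on the averaged gradient; the privacy
            # guarantee comes from this noise plus never releasing
            # the intermediate iterates.
            noise = rng.normal(0.0, noise_std * clip_norm / batch_size, d)
            w -= lr * (grad + noise)
    return w
```

For example, w = noisy_minibatch_gd(X, y, scheme="sample_wo_replacement") runs the second scheme. Under a composition-style analysis, the privacy cost of this loop would grow with the total number of steps; the hidden-state view instead analyzes only the distribution of the returned w, which is what allows the bounds discussed in the talk to stay much smaller over many iterations.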
Syllabus
Jiayuan Ye | Differentially Private Learning Needs Hidden State (or Much Faster Convergence)
Taught by
Harvard CMSA