
YouTube

Differentially Private Learning Needs Hidden State or Much Faster Convergence

Harvard CMSA via YouTube

Overview

Watch a research presentation from the 2022 Symposium on Foundations of Responsible Computing, where Jiayuan Ye from the National University of Singapore explores how the differential privacy analysis of randomized learning algorithms can be improved through hidden-state analysis. Learn about extending hidden-state analysis to various stochastic minibatch gradient descent schemes, including "shuffle and partition" and "sample without replacement" approaches. Discover how this analysis yields significantly smaller privacy bounds than traditional composition bounds when training with many iterations on high-dimensional data. Examine experimental results from training classification models on the MNIST, FMNIST, and CIFAR-10 datasets that demonstrate improved accuracy under fixed privacy budgets using hidden-state analysis.
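
As a loose illustration of the kind of training procedure the talk analyzes, the sketch below implements noisy minibatch gradient descent with the "shuffle and partition" scheme: each epoch shuffles the data once and partitions it into disjoint minibatches, and only the final iterate is released, so the intermediate iterates remain hidden state. Everything here is an assumption for illustration (the logistic-regression loss, the hyperparameters, and the function name), not the speaker's implementation, and a real deployment would calibrate the noise scale with a privacy accountant.

```python
import numpy as np

def noisy_minibatch_sgd(X, y, epochs=5, batch_size=64, lr=0.1,
                        clip=1.0, sigma=1.0, seed=0):
    """Hypothetical noisy minibatch SGD on a logistic-regression loss.

    "Shuffle and partition": each epoch shuffles the data once and
    splits it into disjoint minibatches. Only the last iterate is
    returned; the intermediate iterates are the hidden state that
    tighter privacy analyses can exploit.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)                 # shuffle once per epoch
        for start in range(0, n, batch_size):     # partition into batches
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            # Per-example logistic-loss gradients, clipped to norm <= clip.
            probs = 1.0 / (1.0 + np.exp(-(Xb @ theta)))
            grads = (probs - yb)[:, None] * Xb
            norms = np.linalg.norm(grads, axis=1, keepdims=True)
            grads = grads / np.maximum(1.0, norms / clip)
            # Gaussian noise calibrated to the clipping norm (sigma is an
            # illustrative placeholder; an accountant would set it).
            noise = rng.normal(0.0, sigma * clip, size=d)
            theta -= lr * (grads.sum(axis=0) + noise) / len(idx)
    return theta  # only the final model is released

# Toy usage on synthetic data (not MNIST/FMNIST/CIFAR-10).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] > 0).astype(float)
theta = noisy_minibatch_sgd(X, y)
```

The "sample without replacement" scheme mentioned in the talk differs only in how `idx` is drawn: each step would draw a fresh size-`batch_size` subset without replacement rather than walking through one fixed partition.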

Syllabus

Jiayuan Ye | Differentially Private Learning Needs Hidden State (or Much Faster Convergence)

Taught by

Harvard CMSA
