Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Differentially Private Online-to-Batch Conversion for Stochastic Optimization
1. Intro
2. Privacy and Learning
3. Privacy Preserving Learning
4. Stochastic Optimization
5. Private Stochastic Convex Optimization
6. Typical Strategy 1: DP-SGD
7. Typical Strategy 2: Bespoke Analysis
8. Two techniques summary
9. High-level result
10. Outline of Strategy
11. Key Ingredient 1: Online (Linear) Optimization/Learning
12. Online Linear Optimization
13. Key Ingredient 2: Online to Batch Conversion
14. Straw man algorithm: Gaussian mechanism + online-to-batch
15. Key Ingredient 2: Anytime Online-to-Batch Conversion
16. Important Property of Anytime Online-to-Batch
17. Anytime vs Classic Sensitivity
18. Gradient as sum of gradient differences
19. Our actual strategy
20. Final Ingredient: Tree Aggregation
21. Final Algorithm
22. Loose Ends
23. Unpacking the bound
24. Applications: Adaptivity
25. Applications: Parameter-free/Comparator Adaptive
26. Fine Print, Open problems