Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Interaction Is Necessary for Distributed Learning with Privacy or Communication Constraints
- 1 Interaction is necessary for distributed learning with privacy or communication constraints (Yuval Dagan, MIT; Vitaly Feldman, Apple; research done while at Google Brain)
- 2 Non-interactive vs. interactive
- 3 Learning with local privacy requires interaction
- 4 Learning with communication constraints
- 5 Outline
- 6 Prior work on private stochastic convex optimization: interactive setting; non-interactive setting (upper bound with exponential sample complexity)
- 7 Lower bound: stochastic convex optimization of linear models
- 8 Prior work on locally private linear classification: sample bounds for non-interactive algorithms (Daniely, Feldman '18)
- 9 Lower bound: non-interactive private linear classification. Setting: non-interactive, locally private with parameter ε = 1. Thm: the sample complexity is exp(min(d, 1/γ))
- 10 Proof Techniques
- 11 The statistical query model (SQ) [Kearns '93]: instead of independent samples, the algorithm can ask for statistics about the data, e.g. mean, moments, expected loss, gradient (see the sketch after this list)
- 12 Non-interactive statistical queries
- 13 Two indistinguishable distributions over {-1, 1}^d
- 14 Summary and open questions: learning with non-interactive local privacy requires exponential sample complexity
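
Item 11 above summarizes the statistical query (SQ) model of [Kearns '93], in which a learner asks for approximate statistics of the distribution rather than seeing individual samples. Below is a minimal Python sketch of such an oracle, answering each query with an empirical mean perturbed within a tolerance; the names (`SQOracle`, `answer`) and the use of a finite sample in place of the true distribution are illustrative assumptions, not material from the talk.

```python
import numpy as np

class SQOracle:
    """Minimal sketch of a statistical query oracle over a fixed dataset."""

    def __init__(self, samples, tolerance, rng=None):
        self.samples = samples          # dataset standing in for the distribution
        self.tolerance = tolerance      # accuracy tau of each answer
        self.rng = rng or np.random.default_rng(0)

    def answer(self, query):
        """Return E[query(x)] up to the oracle's tolerance.

        `query` maps a sample to a value in [-1, 1]; the oracle returns the
        empirical mean plus noise bounded by the tolerance.
        """
        mean = np.mean([query(x) for x in self.samples])
        noise = self.rng.uniform(-self.tolerance, self.tolerance)
        return float(mean + noise)

if __name__ == "__main__":
    # Hypothetical usage: query the mean of each coordinate of {-1, 1}-valued data.
    data = np.random.default_rng(1).choice([-1.0, 1.0], size=(100, 5))
    oracle = SQOracle(data, tolerance=0.1)
    queries = [lambda x, i=i: x[i] for i in range(5)]   # fixed in advance
    answers = [oracle.answer(q) for q in queries]
    print(answers)
```

In the non-interactive setting discussed in the talk, all queries must be chosen before any answer is seen, as in the usage above; the interactive setting lets later queries depend on earlier answers.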