Interaction Is Necessary for Distributed Learning with Privacy or Communication Constraints
Association for Computing Machinery (ACM) via YouTube
Overview
Syllabus
Interaction is necessary for distributed learning with privacy or communication constraints. Yuval Dagan, MIT; Vitaly Feldman, Apple (research done while at Google Brain)
Non-interactive vs. interactive
Learning with local privacy requires interaction
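As background for this part of the talk, here is a minimal sketch of a one-round (non-interactive) locally private protocol: randomized response for estimating the mean of a bit held by each user. The function names and the choice ε = 1 are illustrative, not from the talk.

```python
import math
import random

def randomized_response(bit, epsilon, rng):
    """Locally private one-bit report: keep the true bit w.p. e^eps / (1 + e^eps)."""
    p_true = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if rng.random() < p_true else 1 - bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true mean of the bits from the noisy reports."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    raw = sum(reports) / len(reports)
    return (raw - (1.0 - p)) / (2.0 * p - 1.0)

# Each user sends a single noisy bit; the server debiases. No further rounds occur.
rng = random.Random(0)
bits = [1] * 700 + [0] * 300          # true mean is 0.7
reports = [randomized_response(b, 1.0, rng) for b in bits]
estimate = debiased_mean(reports, 1.0)
```

Simple statistics like this mean are learnable without interaction; the talk's point is that richer learning tasks are not.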
Learning with communication constraints
Outline
Prior work on private stochastic convex optimization: interactive setting; non-interactive setting; upper bound with exponential sample complexity
Lower bound: stochastic convex optimization of linear models
Prior work on locally private linear classification: sample bounds for non-interactive algorithms [Daniely, Feldman '18]
Lower bound: non-interactive private linear classification. Setting: non-interactive, locally private with parameter ε = 1. Thm: the sample complexity is exp(Ω(min(d, 1/τ)))
Proof Techniques
The statistical query model (SQ) [Kearns '93]: instead of independent samples, an algorithm can ask for statistics about the data, e.g., means, moments, expected losses, gradients
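The SQ oracle described above can be sketched as a toy simulation (the oracle name and tolerance handling are illustrative assumptions). Submitting all queries in one batch, before seeing any answer, models the non-interactive variant discussed next:

```python
import random

def sq_oracle(samples, queries, tau, rng):
    """Answer a batch of statistical queries q mapping examples to [-1, 1].

    Returns an estimate of E[q(x)] over the sample for each query, perturbed
    by at most the tolerance tau. Answering a single batch submitted up front
    models the non-interactive SQ setting.
    """
    answers = []
    for q in queries:
        mean = sum(q(x) for x in samples) / len(samples)
        answers.append(mean + rng.uniform(-tau, tau))
    return answers

# Example: estimate the first and second moments of scalar data in one round.
rng = random.Random(0)
data = [0.2, -0.5, 0.9, 0.1]
ans = sq_oracle(data, [lambda x: x, lambda x: x * x], tau=0.05, rng=rng)
```

An interactive SQ algorithm would instead call the oracle repeatedly, choosing each new query based on earlier answers.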
Non-interactive statistical queries
Two indistinguishable distributions over {-1,1}^d
Summary and open questions • Learning with non-interactive local privacy requires exponential sample complexity
Taught by
Association for Computing Machinery (ACM)