Gradients Look Alike: Sensitivity is Often Overestimated in DP-SGD

USENIX via YouTube

Overview

Watch a research presentation from USENIX Security '24 exploring how differentially private stochastic gradient descent (DP-SGD) provides stronger privacy guarantees than previously thought for many datapoints in common benchmark datasets. Learn about a novel per-instance privacy analysis that demonstrates how points with similar neighbors in datasets enjoy better data-dependent privacy protection compared to outliers. Discover how researchers developed a new composition theorem to analyze entire training runs, formally proving that DP-SGD leaks significantly less information than indicated by current data-independent guarantees when training on standard benchmarks. Understand the implications for privacy attacks and how they may fail against many datapoints without sufficient adversarial control over training datasets.
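To ground the discussion, the mechanism being analyzed is the standard DP-SGD step: clip each per-example gradient to a fixed norm, average, and add Gaussian noise calibrated to the worst-case (data-independent) sensitivity. The sketch below is illustrative only, using NumPy with hypothetical names; it is not code from the paper, whose contribution is a tighter per-instance analysis of this same mechanism.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD update (illustrative sketch, not the paper's code).

    Each per-example gradient is clipped to clip_norm, the clipped
    gradients are averaged, and Gaussian noise scaled to the worst-case
    sensitivity (clip_norm / batch_size) is added. The standard analysis
    assumes removing any one example could shift the sum by the full
    clip_norm; the talk argues this overestimates sensitivity for points
    whose gradients closely resemble their neighbours'.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(per_example_grads)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down only gradients whose norm exceeds the clipping bound.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean = np.mean(clipped, axis=0)
    # Noise standard deviation follows the data-independent sensitivity bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / n, size=mean.shape)
    return mean + noise
```

With `noise_multiplier=0.0` the function reduces to clipped-gradient averaging, which makes the clipping behaviour easy to verify in isolation.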

Syllabus

USENIX Security '24 - Gradients Look Alike: Sensitivity is Often Overestimated in DP-SGD

Taught by

USENIX

