
Private Convex Optimization via Exponential Mechanism - Differential Privacy for Machine Learning

Google TechTalks via YouTube

Overview

Explore private convex optimization through the exponential mechanism in this Google TechTalk presented by Daogao Liu. Delve into differential privacy for machine learning, covering topics such as noisy stochastic gradient descent and the regularized exponential mechanism. Examine the isoperimetric inequality for strongly log-concave measures and concentration bounds for Lipschitz functions. Learn about DP-Stochastic Convex Optimization and the intuition behind it. Discover new sampling algorithms and their applications to DP-ERM and DP-SCO. Gain insights into bounding generalization error, Wasserstein distance, KL divergence, and population loss. Understand the contributions and open problems in private convex optimization.
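As a point of reference for the noisy SGD baseline mentioned above, the sketch below shows a minimal differentially private SGD loop in Python: per-example gradients are clipped and Gaussian noise is added before each update. This is a generic illustration only, not the talk's method; the squared loss, clip norm C, noise multiplier sigma, step size eta, and step count are placeholder choices, and the talk's own contribution is the regularized exponential mechanism rather than this baseline.

import numpy as np

def noisy_sgd(X, y, steps=1000, eta=0.1, C=1.0, sigma=1.0, seed=0):
    # Illustrative noisy SGD on the squared loss; hyperparameters are placeholders.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)
        # Gradient of 0.5 * (x_i . theta - y_i)^2 on a single example (convex in theta).
        g = (X[i] @ theta - y[i]) * X[i]
        # Clip the per-example gradient so its norm is at most C (bounds sensitivity).
        g = g / max(1.0, np.linalg.norm(g) / C)
        # Add Gaussian noise calibrated to the clip norm before the update.
        g = g + rng.normal(0.0, sigma * C, size=d)
        theta = theta - eta * g
    return theta

A quick check would be theta = noisy_sgd(X, y) on synthetic data with X of shape (n, d); the noise scale trades accuracy for privacy, which is the trade-off the regularized exponential mechanism discussed in the talk aims to improve.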

Syllabus

Intro
One-sentence Summary
Differential Privacy
Noisy SGD
Regularized Exponential Mechanism (RegEM)
Isoperimetric Inequality for strongly log-concave measures
Concentration bounds for Lipschitz functions
Proof Sketch
Utility Analysis
A Question from the Duck
DP-Stochastic Convex Optimization (SCO)
Intuition
Open Problems
RegEM Revisited
Bounding Generalization Error
Bound Wasserstein Distance
Bounding KL Divergence
Bounding Population Loss
Summary of Contributions
A new sampling algorithm
Algorithms for DP-ERM and DP-SCO

Taught by

Google TechTalks
