

The Value of Collaboration in Convex Machine Learning with Differential Privacy

IEEE via YouTube

Overview

Explore the value of collaboration in convex machine learning with differential privacy in this IEEE conference talk. Delve into the application of machine learning to distributed private data owned by multiple entities, using noisy, differentially private gradients to minimize fitness costs through stochastic gradient descent. Examine the trade-off between privacy and utility by quantifying model quality as a function of privacy budget and dataset size. Discover how to predict collaboration outcomes among privacy-aware data owners before executing computationally expensive algorithms. Learn how differences in model fitness shrink with larger datasets and privacy budgets. Validate performance predictions with practical applications on financial datasets, including interest-rate determination for loans using regression and credit card fraud detection using support vector machines. Gain insights into training with differentially private gradients, the convergence of learning algorithms, and the future of privacy-aware machine learning.
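The noisy-gradient training described above can be sketched in a few lines. This is a minimal illustration, not the talk's algorithm: the `dp_sgd` function, its parameters, and the noise calibration (a simplified Gaussian-mechanism composition bound) are all assumptions made for the sake of example.

```python
import numpy as np

def dp_sgd(X, y, epsilon, delta, epochs=20, clip=1.0, lr0=0.1, seed=0):
    """Sketch of differentially private SGD for a convex fitness cost
    (squared loss here). Each per-example gradient is clipped to bound
    its sensitivity, then Gaussian noise is added before the update.
    The noise scale below uses a simplified composition bound, not the
    talk's exact privacy accounting."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    steps = epochs * n
    # Gaussian-mechanism noise scale for the whole run (assumed calibration).
    sigma = clip * np.sqrt(2 * steps * np.log(1.25 / delta)) / (n * epsilon)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            lr = lr0 / np.sqrt(t)                 # slowly decaying step size
            g = (X[i] @ w - y[i]) * X[i]          # gradient of 0.5*(x.w - y)^2
            g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # clip norm
            g += rng.normal(0.0, sigma, size=d)   # privatizing noise
            w -= lr * g
    return w
```

Note how the privacy/utility trade-off discussed in the talk shows up directly: a smaller privacy budget `epsilon` or a smaller dataset size `n` inflates `sigma`, so the noisier gradients yield a poorer-fitting model.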

Syllabus

Learning on multiple datasets
State-of-the-art
Training ML models
Learning on private datasets
Training with DP gradients for general convex fitness functions
Slower decaying learning rate
Value of collaboration
Experiment with loan data
Convergence of learning algorithm
Prediction vs reality
Conclusions and future work
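The "slower decaying learning rate" item in the syllabus refers to step-size schedules suited to noisy gradients. A hypothetical one-dimensional comparison (not from the talk) of the standard O(1/t) decay against the slower O(1/sqrt(t)) decay:

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_1d(schedule, steps=5000, noise=1.0):
    """Minimize f(w) = 0.5 * w**2 from w = 5.0 using noisy gradients
    g = w + Gaussian noise, under a given step-size schedule."""
    w = 5.0
    for t in range(1, steps + 1):
        g = w + rng.normal(0.0, noise)  # noisy gradient, as with DP gradients
        w -= schedule(t) * g
    return w

w_fast = sgd_1d(lambda t: 1.0 / t)           # standard O(1/t) decay
w_slow = sgd_1d(lambda t: 1.0 / np.sqrt(t))  # slower O(1/sqrt(t)) decay
```

Both schedules drive the iterate toward the minimum here; the slower decay keeps steps large enough to keep making progress when gradients are noisy, which is why it appears in DP training of general convex fitness functions.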

Taught by

IEEE Symposium on Security and Privacy

