Overview
Explore a groundbreaking system for scalable privacy-preserving machine learning in this IEEE conference talk. Delve into new and efficient protocols for privacy-preserving linear regression, logistic regression, and neural network training using stochastic gradient descent. Discover innovative techniques for secure arithmetic operations on shared decimal numbers and MPC-friendly alternatives to nonlinear functions. Learn how this system, implemented in C++, outperforms state-of-the-art implementations for privacy-preserving linear and logistic regressions, scaling to millions of data samples with thousands of features. Gain insights into the first privacy-preserving system for training neural networks, addressing the critical balance between data utility and privacy concerns in modern machine learning applications.
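To make the "secure arithmetic operations on shared decimal numbers" concrete, the sketch below illustrates, in plaintext C++, the kind of technique the talk covers: decimals are encoded as fixed-point elements of an integer ring, additively shared, and after a multiplication each party truncates its own share locally, with no interaction. The ring size, fractional precision, and helper names here are illustrative assumptions, not parameters taken from the talk.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <random>

// Illustrative parameters (assumptions, not from the talk):
// 64-bit ring Z_{2^64}, 13 fractional bits of fixed-point precision.
constexpr int FRAC = 13;

uint64_t encode(double x) {               // real -> fixed-point ring element
    return (uint64_t)(int64_t)std::llround(x * (double)(1ULL << FRAC));
}
double decode(uint64_t v) {               // fixed-point ring element -> real
    return (double)(int64_t)v / (double)(1ULL << FRAC);
}

// Additive secret sharing over Z_{2^64}; unsigned wrap-around is the modulus.
void share(uint64_t v, std::mt19937_64& rng, uint64_t& s0, uint64_t& s1) {
    s0 = rng();
    s1 = v - s0;
}

// Local truncation of a share after a fixed-point multiplication:
// party 0 shifts its share down; party 1 negates, shifts, and negates back.
// With high probability the reconstructed result equals the truncated
// product up to +-1 in the least significant fractional bit.
uint64_t truncate_share(uint64_t s, int party) {
    return party == 0 ? (s >> FRAC) : -((-s) >> FRAC);
}

int main() {
    std::mt19937_64 rng(7);
    double x = -2.718, y = 1.618;
    uint64_t prod = encode(x) * encode(y); // product carries 2*FRAC fractional bits
    uint64_t s0, s1;
    share(prod, rng, s0, s1);
    uint64_t t0 = truncate_share(s0, 0);   // each party truncates its own share
    uint64_t t1 = truncate_share(s1, 1);   // with no communication
    std::printf("x*y = %.4f, reconstructed = %.4f\n", x * y, decode(t0 + t1));
}
```

The appeal of this approach is that truncation costs no communication between the parties; the price is a possible off-by-one error in the last fractional bit, which is tolerable for SGD-style training.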
Syllabus
Intro
Privacy-preserving Machine Learning
Our Contributions
Privacy-preserving Linear Regression
Decimal Multiplications in Integer Fields
Truncation on Shared Values
Effects of Our Technique
Privacy-preserving Logistic Regression
Experiment Results: Linear Regression
Experiment Results: Logistic Regression
Experiments: Neural Networks
Summary
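The logistic regression portion of the syllabus relates to the "MPC-friendly alternatives to nonlinear functions" mentioned in the overview. As a rough illustration, the plaintext sketch below shows a piecewise-linear stand-in for the logistic sigmoid of the kind associated with this line of work; the exact function and constants used in the talk are not given on this page, so treat them as assumptions.

```cpp
#include <algorithm>
#include <cstdio>

// Piecewise-linear stand-in for the logistic sigmoid:
// 0 below -1/2, x + 1/2 in between, 1 above +1/2.
// Unlike exp(), it needs only comparisons and additions, which are far
// cheaper to evaluate on secret-shared values inside an MPC protocol.
double mpc_friendly_sigmoid(double x) {
    return std::clamp(x + 0.5, 0.0, 1.0);
}

int main() {
    for (double x : {-2.0, -0.5, 0.0, 0.5, 2.0})
        std::printf("f(%+.1f) = %.2f\n", x, mpc_friendly_sigmoid(x));
}
```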
Taught by
IEEE Symposium on Security and Privacy