Studying Generalization in Deep Learning via PAC-Bayes

Simons Institute via YouTube

Overview

Explore the intersection of generalization theory and deep learning in this 45-minute lecture from the Frontiers of Deep Learning series. Delve into PAC-Bayes theory and how it yields risk bounds for both Gibbs (randomized) classifiers and deterministic classifiers. Examine distribution-dependent approximations of optimal priors via privacy, and the idea of using SGD to predict SGD. Investigate data- and distribution-dependent priors for neural networks, focusing on MNIST results with coupled data-dependent priors and posteriors. Analyze bounds computed with oracle access to the optimal prior covariance and with a ghost sample, comparing results on 32k versus 64k training samples. Gain insight into what generalization theory might offer deep learning and the barriers to explaining generalization.
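
For orientation, here is one standard form of the PAC-Bayes risk bound for Gibbs classifiers, in the McAllester/Maurer style; the lecture may work with a different variant. For any prior P fixed before seeing the data, with probability at least 1 − δ over an i.i.d. sample S of size m, simultaneously for all posteriors Q:

\[
L(Q) \;\le\; \hat{L}_S(Q) + \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln(2\sqrt{m}/\delta)}{2m}},
\]

where L(Q) is the expected risk of the Gibbs classifier that predicts with a hypothesis drawn from Q, and \(\hat{L}_S(Q)\) is its empirical risk on S. The KL(Q ‖ P) term is what motivates the lecture's focus on choosing good (data- and distribution-dependent) priors.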

Syllabus

Intro
What might generalization theory offer deep learning?
Barriers to explaining generalization
PAC-Bayes yields risk bounds for Gibbs classifiers
PAC-Bayes generalization bounds
PAC-Bayes bounds on deterministic classifiers
Distribution-dependent approximations of optimal priors via privacy
A question of interpretation
Use SGD to predict SGD
Data and distribution priors for neural networks
MNIST Results - Coupled data-dependent priors and posteriors
Oracle access to optimal prior covariance
Bounds with oracle covariance + ghost sample
Bounds on 32k samples vs. 64k samples
Recap and Conclusion

Taught by

Simons Institute
