Tightening Information-Theoretic Generalization Bounds with Data-Dependent Estimates with an Application to SGLD - Daniel Roy

Institute for Advanced Study via YouTube

Overview

Explore a workshop presentation on tightening information-theoretic generalization bounds with data-dependent estimates, with an application to Stochastic Gradient Langevin Dynamics (SGLD). Delve into what it means to understand generalization, along with open problems and barriers in the field. Examine non-vacuous bounds, stochastic gradient dynamics, and expected generalization error. Daniel Roy of the University of Toronto discusses these topics in deep learning theory, offering insights into current challenges and potential future directions.
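As background (a standard formulation from the literature, not necessarily the exact statement used in the talk), the information-theoretic bounds being tightened descend from the Xu-Raginsky result: if the loss is $\sigma$-sub-Gaussian, a learning algorithm that maps a sample $S$ of $n$ points to weights $W$ satisfies

\[
\bigl|\mathbb{E}[\mathrm{gen}(S, W)]\bigr| \le \sqrt{\frac{2\sigma^2}{n}\, I(S; W)},
\]

where $I(S; W)$ is the mutual information between the training data and the learned weights. The data-dependent estimates discussed in the talk aim to control this mutual-information term for noisy iterative methods such as SGLD, whose update (again in its standard form) perturbs each stochastic gradient step with Gaussian noise:

\[
W_{t+1} = W_t - \eta_t\, \widehat{\nabla L}(W_t) + \sqrt{\tfrac{2\eta_t}{\beta}}\, \xi_t, \qquad \xi_t \sim \mathcal{N}(0, I),
\]

with step size $\eta_t$ and inverse temperature $\beta$.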

Syllabus

Intro
The nature of generalization understanding
Open problem
Barriers
Non-vacuous bounds
Stochastic gradient dynamics
Expected generalization error
Plot
Conclusion

Taught by

Institute for Advanced Study
