
The Elusive Generalization: Classical Bounds to Double Descent to Grokking

Simons Institute via YouTube

Overview

Explore the evolution of generalization in machine learning through this comprehensive lecture by Misha Belkin from the University of California, San Diego. Delve into recent developments that have challenged traditional theoretical foundations, including empirical findings in neural networks. Examine the limitations of using training loss as a proxy for test loss and understand the implications of phenomena such as interpolation and double descent. Investigate the practice of early stopping and its potential shortcomings in light of emergent phenomena like grokking. Analyze the fundamental challenges these discoveries present to both the theory and practice of machine learning, and gain insights into the current state of understanding in the field.
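The lecture's point about early stopping can be made concrete with a minimal sketch. The code below implements a standard patience-based early-stopping rule over a hypothetical validation-loss curve (the numbers are illustrative, not from the lecture); as the lecture notes, in regimes like grokking, halting here can be premature because test loss may improve again much later in training.

```python
# Minimal early-stopping sketch on a hypothetical validation-loss
# curve: loss improves, then appears to overfit. Grokking shows
# that stopping at the apparent minimum can be premature, since
# test loss sometimes drops again long after this point.
val_losses = [1.0, 0.7, 0.5, 0.45, 0.47, 0.52, 0.6]  # illustrative values

patience = 2          # stop after this many non-improving epochs
best = float("inf")   # best validation loss seen so far
best_epoch = 0
bad_epochs = 0        # consecutive epochs without improvement

for epoch, loss in enumerate(val_losses):
    if loss < best:
        best, best_epoch, bad_epochs = loss, epoch, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break  # early stop: no improvement for `patience` epochs

print(best_epoch, best)  # → 3 0.45
```

Here training would halt at epoch 5 and keep the epoch-3 checkpoint; the lecture's argument is that this heuristic implicitly trusts validation loss as a proxy for final test performance, which emergent phenomena can violate.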

Syllabus

The elusive generalization: classical bounds to double descent to grokking

Taught by

Simons Institute

