Classroom Contents
From Classical Statistics to Modern ML - The Lessons of Deep Learning - Mikhail Belkin
- 1 Intro
- 2 Empirical Risk Minimization
- 3 The ERM/SRM theory of learning
- 4 Uniform laws of large numbers
- 5 Capacity control
- 6 U-shaped generalization curve
- 7 Does interpolation overfit?
- 8 Interpolation does not overfit even for very noisy data
- 9 Why bounds fail
- 10 Interpolation is best practice for deep learning
- 11 Historical recognition
- 12 Where we are now: the key lesson
- 13 Generalization theory for interpolation?
- 14 Interpolated k-NN schemes
- 15 Interpolation and adversarial examples
- 16 "Double descent" risk curve
- 17 Random Fourier networks
- 18 What is the mechanism?
- 19 Is infinite width optimal?
- 20 Smoothness by averaging
- 21 Double Descent in Random Feature settings
- 22 Framework for modern ML
- 23 The landscape of generalization
- 24 Optimization: classical
- 25 The power of interpolation
- 26 Learning from deep learning: fast and effective kernel machines
- 27 Points and lessons