Loss Landscape and Performance in Deep Learning by Stefano Spigler
Classroom Contents
- 1 Loss Landscape and Performance in Deep Learning
- 2 Supervised Deep Learning
- 3 Set-up: Architecture
- 4 Set-up: Dataset
- 5 Learning
- 6 Learning dynamics = descent in loss landscape
- 7 Analogy with granular matter: Jamming
- 8 Theoretical results: Phase diagram
- 9 Empirical tests: MNIST parity
- 10 Landscape curvature
- 11 Flat directions
- 12 Outline
- 13 Overfitting?
- 14 Ensemble average
- 15 Fluctuations increase error
- 16 Scaling argument!
- 17 Infinitely-wide networks: Initialization
- 18 Infinitely-wide networks: Learning
- 19 Neural Tangent Kernel
- 20 Finite N asymptotics?
- 21 Conclusion