Classroom Contents
Rapture of the Deep: Highs and Lows of Sparsity in Neural Networks
1. Intro
2. Based on joint work with
3. Sparsity & frugality
4. Sparsity & interpretability
5. Deep sparsity?
6. Bilinear sparsity: blind deconvolution
7. ReLU network training - weight decay
8. Behind the scenes
9. Greed is good?
10. Optimization with support constraints
11. Application: butterfly factorization
12. Wandering in equivalence classes
13. Other consequences of scale-invariance
14. Conservation laws