Rapture of the Deep: Highs and Lows of Sparsity in Neural Networks

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Optimization with support constraints


Classroom Contents

  1. Intro
  2. Based on joint work with
  3. Sparsity & frugality
  4. Sparsity & interpretability
  5. Deep sparsity?
  6. Bilinear sparsity: blind deconvolution
  7. ReLU network training - weight decay
  8. Behind the scenes
  9. Greed is good?
  10. Optimization with support constraints
  11. Application: butterfly factorization
  12. Wandering in equivalence classes
  13. Other consequences of scale-invariance
  14. Conservation laws