Rapture of the Deep: Highs and Lows of Sparsity in Neural Networks
Institut des Hautes Études Scientifiques (IHÉS) via YouTube
Syllabus
Intro
Based on joint work with
Sparsity & frugality
Sparsity & interpretability
Deep sparsity?
Bilinear sparsity: blind deconvolution
ReLU network training - weight decay
Behind the scenes
Greed is good?
Optimization with support constraints
Application: butterfly factorization
Wandering in equivalence classes
Other consequences of scale-invariance
Conservation laws
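The syllabus item on butterfly factorization refers to decomposing a structured N×N matrix into O(log N) sparse "butterfly" factors with only two nonzeros per row, so the product needs O(N log N) parameters instead of N². As an illustrative sketch (not taken from the talk), the Hadamard matrix admits an exact butterfly factorization via Kronecker products:

```python
import numpy as np

H2 = np.array([[1, 1], [1, -1]])

def butterfly_factors(n):
    """n sparse factors whose product is the 2^n x 2^n Hadamard matrix.

    Factor i is I_{2^i} (x) H2 (x) I_{2^{n-1-i}}: each has exactly
    two nonzeros per row and per column (the butterfly pattern)."""
    return [np.kron(np.eye(2**i), np.kron(H2, np.eye(2**(n - 1 - i))))
            for i in range(n)]

n = 3
factors = butterfly_factors(n)
H = np.linalg.multi_dot(factors)

# Reference Hadamard matrix built by repeated Kronecker products.
H_ref = np.array([[1]])
for _ in range(n):
    H_ref = np.kron(H2, H_ref)

assert np.array_equal(H, H_ref)
# Sparsity check: every factor has only 2 nonzeros per row.
assert all((f != 0).sum(axis=1).max() == 2 for f in factors)
```

Recovering such factors from the dense matrix alone (rather than from its known Kronecker structure, as here) is the harder optimization problem the talk's section addresses.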
Taught by
Institut des Hautes Études Scientifiques (IHÉS)