
Rapture of the Deep: Highs and Lows of Sparsity in Neural Networks

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Overview

Explore the depths of sparsity in neural networks through this 37-minute conference talk by Rémi Gribonval of INRIA, hosted by the Institut des Hautes Etudes Scientifiques (IHES). Delve into how promoting sparse connections in neural networks naturally controls their complexity and may offer interpretability guarantees. Compare classical sparse regularization for inverse problems with multilayer sparse approximation. Discover the role of rescaling invariances in deep parameterizations, along with the advantages and challenges they bring. Learn about life beyond gradient descent, including an algorithm that significantly speeds up the learning of certain fast transforms via multilayer sparse factorization. Cover topics such as bilinear sparsity, blind deconvolution, ReLU network training with weight decay, optimization under support constraints, butterfly factorization, and the consequences of scale invariance in neural networks.
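As a minimal illustration of the rescaling invariances mentioned above (a sketch assuming a plain two-layer ReLU network; the weights, dimensions, and names below are arbitrary, not taken from the talk), positive per-neuron rescalings leave the network function unchanged, because ReLU is positively homogeneous: ReLU(d·z) = d·ReLU(z) for any d > 0.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 4))   # hidden-layer weights
W2 = rng.standard_normal((1, 16))   # output-layer weights
x = rng.standard_normal(4)          # an arbitrary input

relu = lambda z: np.maximum(z, 0.0)
net = lambda A, B: B @ relu(A @ x)  # two-layer ReLU network

# Scale each hidden neuron's incoming weights by some d > 0 and its
# outgoing weights by 1/d: the realized function does not change.
d = rng.uniform(0.5, 2.0, size=16)
assert np.allclose(net(W1, W2), net(d[:, None] * W1, W2 / d))
```

Such symmetries make deep parameterizations non-identifiable and interact with regularizers like weight decay, which is part of what the talk examines.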

Syllabus

Intro
Based on joint work with
Sparsity & frugality
Sparsity & interpretability
Deep sparsity?
Bilinear sparsity: blind deconvolution
ReLU network training - weight decay
Behind the scenes
Greed is good?
Optimization with support constraints
Application: butterfly factorization (sketched after this syllabus)
Wandering in equivalence classes
Other consequences of scale-invariance
Conservation laws
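To make the butterfly factorization item concrete, here is a minimal numpy sketch; it shows not the algorithm from the talk but the classical Cooley-Tukey structure it builds on, and the helper names are illustrative. The N×N DFT matrix factors exactly into log2(N) sparse "butterfly" factors with two nonzeros per row, plus a bit-reversal permutation.

```python
import numpy as np

def butterfly_factor(n):
    # B_n = [[I, D], [I, -D]] with D = diag(w^k), w = exp(-2j*pi/n):
    # one radix-2 FFT stage, only two nonzeros per row.
    m = n // 2
    D = np.diag(np.exp(-2j * np.pi * np.arange(m) / n))
    I = np.eye(m)
    return np.block([[I, D], [I, -D]])

def bit_reversal(N):
    # Permutation matrix reordering indices into bit-reversed order.
    bits = N.bit_length() - 1
    rev = [int(format(i, f"0{bits}b")[::-1], 2) for i in range(N)]
    return np.eye(N)[rev]

N = 8  # any power of two
sizes = [N >> k for k in range(int(np.log2(N)))]  # N, N/2, ..., 2
factors = [np.kron(np.eye(N // n), butterfly_factor(n)) for n in sizes]
dft = np.linalg.multi_dot(factors + [bit_reversal(N)])
assert np.allclose(dft, np.fft.fft(np.eye(N)))  # sparse product == dense DFT
```

Recovering such sparse factors from the dense matrix alone, rather than assuming them in advance, is what the multilayer sparse factorization approach discussed in the talk aims to do efficiently.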

Taught by

Institut des Hautes Etudes Scientifiques (IHES)
