

A Bregman Learning Framework for Sparse Neural Networks

Society for Industrial and Applied Mathematics via YouTube

Overview

Explore a learning framework for sparse neural networks in this talk from the 30th Imaging & Inverse Problems (IMAGINE) OneWorld SIAM-IS Virtual Seminar Series. Delve into Leon Bungert's presentation of a framework based on stochastic Bregman iterations, which trains sparse neural networks via an inverse scale space approach: the network starts from a sparse initialization and gains parameters only as training demands them. Learn about the baseline LinBreg algorithm, its momentum-accelerated variant, and AdaBreg, a Bregmanized generalization of the Adam optimizer. Discover a statistically sound sparse parameter initialization strategy, a stochastic convergence analysis of the loss decay, and additional convergence results in the convex regime. Understand how the Bregman learning framework can be applied to neural architecture search, potentially uncovering autoencoder architectures for denoising or deblurring tasks.
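
The inverse scale space idea behind LinBreg-style updates can be sketched in a few lines. The example below is a minimal illustration, not the code presented in the talk: it assumes an elastic-net Bregman potential J(theta) = lam*||theta||_1 + ||theta||_2^2/(2*delta), for which the linearized Bregman step reduces to a gradient update on an auxiliary variable followed by scaled soft-thresholding, and it applies this update to a small sparse least-squares problem. The problem setup, step sizes, and variable names are illustrative assumptions.

import numpy as np

# Minimal illustrative sketch (not the speaker's reference implementation):
# a linearized Bregman iteration for sparse least squares. With the assumed
# elastic-net potential J(theta) = lam*||theta||_1 + ||theta||_2^2/(2*delta),
# each step is a gradient update on an auxiliary variable v followed by
# scaled soft-thresholding.

rng = np.random.default_rng(0)

# Synthetic regression problem whose true coefficient vector is sparse.
n_samples, n_features = 200, 50
A = rng.normal(size=(n_samples, n_features))
theta_true = np.zeros(n_features)
theta_true[:5] = rng.normal(scale=2.0, size=5)
y = A @ theta_true + 0.01 * rng.normal(size=n_samples)

lam, delta, tau = 1.0, 1.0, 0.05  # sparsity weight, scaling, step size (assumed)

def shrink(v, lam):
    # Soft-thresholding: proximal map of lam * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Sparse initialization: the auxiliary variable starts inside [-lam, lam],
# so theta is exactly zero at first; a coordinate activates only once the
# accumulated gradient pushes |v| past lam (inverse scale space behaviour).
v = rng.uniform(-lam, lam, size=n_features)
theta = delta * shrink(v, lam)

for k in range(501):
    grad = A.T @ (A @ theta - y) / n_samples  # gradient of the least-squares loss
    v -= tau * grad                           # Bregman / mirror-descent step on v
    theta = delta * shrink(v, lam)            # map back to the sparse parameters
    if k % 100 == 0:
        loss = 0.5 * np.mean((A @ theta - y) ** 2)
        print(f"iter {k:3d}  nonzeros {np.count_nonzero(theta):2d}  loss {loss:.4f}")

Run as a script, the printed nonzero count typically grows as the loss decays: coefficients enter the model one by one rather than being pruned after training, which is the inverse scale space behaviour the talk builds on.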

Syllabus

30th Imaging & Inverse Problems (IMAGINE) OneWorld SIAM-IS Virtual Seminar Series Talk

Taught by

Society for Industrial and Applied Mathematics

