
Pruning and Sparsity in Neural Networks - Lecture 4

MIT HAN Lab via YouTube

Overview

Dive deeper into neural network pruning and sparsity in this lecture from MIT's course on TinyML and Efficient Deep Learning Computing. Explore advanced pruning techniques, including how to select optimal pruning ratios for each layer and fine-tune sparse neural networks. Discover the lottery ticket hypothesis and learn about system support for sparsity. Gain valuable insights into making deep learning models more efficient and deployable on resource-constrained devices. Access accompanying slides and additional course materials to enhance your understanding of pruning, sensitivity scans, automatic pruning, and the AMC algorithm.
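The core idea the lecture builds on, magnitude pruning with a per-layer pruning ratio (the ratios themselves chosen by a sensitivity scan or an automatic method like AMC), can be sketched as follows. This is a minimal NumPy illustration under my own naming, not the lecture's actual code:

```python
import numpy as np

def magnitude_prune(weights, ratio):
    """Zero out the fraction `ratio` of weights with the smallest magnitude.

    Returns the pruned weights and the boolean keep-mask, which is what
    fine-tuning of a sparse network would hold fixed between updates.
    """
    flat = np.abs(weights).ravel()
    k = int(len(flat) * ratio)  # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights, dtype=bool)
    # Threshold at the k-th smallest magnitude; everything at or below it is pruned.
    threshold = np.sort(flat)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Hypothetical layer with a per-layer ratio of 0.5 (e.g. from a sensitivity scan).
layer = np.array([[0.9, -0.05],
                  [0.01, -1.2]])
pruned, mask = magnitude_prune(layer, 0.5)
# The two smallest-magnitude weights (0.01 and -0.05) are zeroed;
# the large weights 0.9 and -1.2 survive.
```

In practice each layer gets its own ratio, since layers differ in how sensitive accuracy is to their sparsity; that selection problem is exactly what the sensitivity-scan and AMC portions of the lecture address.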

Syllabus

Lecture 04 - Pruning and Sparsity (Part II) | MIT 6.S965

Taught by

MIT HAN Lab

