Overview
Dive deeper into neural network pruning and sparsity in this lecture from MIT's course on TinyML and Efficient Deep Learning Computing. Explore advanced pruning techniques, including how to select optimal pruning ratios for each layer and fine-tune sparse neural networks. Discover the lottery ticket hypothesis and learn about system support for sparsity. Gain valuable insights into making deep learning models more efficient and deployable on resource-constrained devices. Access accompanying slides and additional course materials to enhance your understanding of pruning, sensitivity scans, automatic pruning, and the AMC algorithm.
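As a concrete illustration of the ideas above, the sketch below shows magnitude-based pruning and a per-layer sensitivity scan. This is a minimal, hypothetical example (not code from the lecture): a real sensitivity scan measures validation accuracy after pruning each layer at several ratios, whereas here the retained L2 weight norm stands in as a cheap proxy score, and the `magnitude_prune` and `sensitivity_scan` names are illustrative.

```python
import numpy as np

def magnitude_prune(w, ratio):
    """Zero out the `ratio` fraction of weights with the smallest magnitude."""
    k = int(w.size * ratio)
    if k == 0:
        return w.copy()
    # Threshold at the k-th smallest absolute value (flattened).
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) > threshold, w, 0.0)

def sensitivity_scan(layers, ratios):
    """Prune each layer independently at each ratio and record a proxy score.

    Here the score is the fraction of the layer's L2 norm retained; in
    practice one would evaluate the pruned model's validation accuracy.
    """
    curves = {}
    for name, w in layers.items():
        base = np.linalg.norm(w)
        curves[name] = [np.linalg.norm(magnitude_prune(w, r)) / base
                        for r in ratios]
    return curves

# Hypothetical usage: two random "layers" scanned at three pruning ratios.
rng = np.random.default_rng(0)
layers = {"conv1": rng.normal(size=(64, 27)), "fc": rng.normal(size=(10, 256))}
curves = sensitivity_scan(layers, [0.0, 0.5, 0.9])
```

Layers whose score drops steeply as the ratio grows are more sensitive, so a per-layer ratio selection (manual or automatic, as in AMC) would assign them lower pruning ratios.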
Syllabus
Lecture 04 - Pruning and Sparsity (Part II) | MIT 6.S965
Taught by
MIT HAN Lab