Overview
Explore a 45-minute video lecture on SynFlow, a groundbreaking algorithm for pruning neural networks without using any data. Delve into the Lottery Ticket Hypothesis and understand why earlier pruning approaches fall short. Learn about layer collapse, the conservation of synaptic saliency, and how iterative pruning avoids these failure modes. Discover the SynFlow algorithm, which achieves maximal compression by conserving synaptic flow. Examine the experimental results and gain insight into this data-agnostic approach, which challenges the notion that data is needed to determine which synapses in a neural network are important.
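To make the data-free idea concrete before watching, here is a minimal PyTorch sketch of how a SynFlow-style score could be computed: weights are temporarily replaced by their absolute values, a forward pass is run on an all-ones input (no real data), and each weight is scored by |gradient × weight|. This is an illustrative assumption of the general recipe, not the lecture's or the paper's reference implementation; names like `synflow_scores` are made up for this example.

```python
import torch
import torch.nn as nn

def synflow_scores(model: nn.Module, input_shape: tuple) -> dict:
    """Illustrative sketch: score each parameter by |dR/dw * w|, where R is the
    output of a forward pass on an all-ones input with absolute-valued weights.
    No training data is used anywhere."""
    # Temporarily strip the signs from the weights.
    signs = {}
    for name, p in model.named_parameters():
        signs[name] = torch.sign(p.data)
        p.data.abs_()

    model.zero_grad()
    ones = torch.ones(1, *input_shape)   # all-ones "input" instead of real data
    R = model(ones).sum()                # scalar synaptic-flow objective
    R.backward()

    scores = {name: (p.grad * p.data).abs().clone()
              for name, p in model.named_parameters() if p.grad is not None}

    # Restore the original signs of the weights.
    for name, p in model.named_parameters():
        p.data.mul_(signs[name])
    return scores
```

In the spirit of the lecture, such scores would drive iterative pruning: repeatedly remove a small fraction of the lowest-scoring weights and re-score, rather than pruning to the target sparsity in one shot, which is what the video connects to avoiding layer collapse.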
Syllabus
- Intro & Overview
- Pruning Neural Networks
- Lottery Ticket Hypothesis
- Paper Story Overview
- Layer Collapse
- Synaptic Saliency Conservation
- Connecting Layer Collapse & Saliency Conservation
- Iterative Pruning avoids Layer Collapse
- The SynFlow Algorithm
- Experiments
- Conclusion & Comments
Taught by
Yannic Kilcher