The Lottery Ticket Hypothesis - Michael Carbin

Massachusetts Institute of Technology via YouTube

Overview

Explore the groundbreaking "Lottery Ticket Hypothesis" for neural network pruning in this seminar by Michael Carbin of MIT. Delve into pruning techniques that reduce the parameter counts of trained networks by more than 90% without compromising accuracy. Discover how iterative magnitude pruning uncovers sparse subnetworks that can be trained effectively from early in training. Learn about the potential for more efficient machine learning, including faster inference, fine-tuning of pre-trained networks, and sparse training. Gain insight into Carbin's broader research on the semantics, design, and implementation of systems that operate in the presence of uncertainty in their environment, implementation, or execution. Follow the talk from background on network pruning through current understanding and the implications for future research on sparse, trainable neural networks.
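For readers who want a concrete picture of the procedure discussed in the talk, the following is a minimal sketch of iterative magnitude pruning with rewinding, written in PyTorch. The function and parameter names (iterative_magnitude_pruning, train_fn, prune_fraction, rounds) are illustrative assumptions for this sketch, not code from the seminar or the original paper.

```python
# Minimal sketch of iterative magnitude pruning (IMP) with rewinding.
# Illustrative assumption of the procedure, not the authors' code.
import copy
import torch
import torch.nn as nn

def iterative_magnitude_pruning(model, train_fn, prune_fraction=0.2, rounds=5):
    """Train, prune the smallest-magnitude weights, rewind the survivors
    to their original initialization, and repeat.

    train_fn(model, masks) is assumed to train `model` while keeping
    weights that are zero in `masks` at zero (e.g. by re-applying the
    masks after each optimizer step).
    """
    # Save the initial weights so surviving weights can be rewound later.
    init_state = copy.deepcopy(model.state_dict())

    # Start with all-ones masks over weight matrices (biases left dense).
    masks = {name: torch.ones_like(p)
             for name, p in model.named_parameters() if p.dim() > 1}

    for _ in range(rounds):
        # 1. Train the masked network to completion.
        train_fn(model, masks)

        with torch.no_grad():
            # 2. Per layer, prune the smallest-magnitude surviving weights.
            for name, p in model.named_parameters():
                if name not in masks:
                    continue
                magnitudes = (p * masks[name]).abs()
                surviving = magnitudes[masks[name] == 1]
                k = int(prune_fraction * surviving.numel())
                if k == 0:
                    continue
                threshold = surviving.kthvalue(k).values
                masks[name][magnitudes <= threshold] = 0.0

            # 3. Rewind surviving weights to their initial values.
            model.load_state_dict(init_state)
            for name, p in model.named_parameters():
                if name in masks:
                    p.mul_(masks[name])

    return model, masks

if __name__ == "__main__":
    # Tiny usage example with a stand-in training function.
    net = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    noop_train = lambda m, masks: None  # replace with a real training loop
    net, masks = iterative_magnitude_pruning(net, noop_train, rounds=3)
    remaining = sum(int(m.sum()) for m in masks.values())
    total = sum(m.numel() for m in masks.values())
    print(f"surviving weights: {remaining}/{total}")
```

In the lottery ticket line of work, rewinding targets either the original initialization or an early training checkpoint (the "rewinding" discussed in the syllabus); the sketch above rewinds to the initialization for simplicity.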

Syllabus

Intro
Neural Networks are Large
Background: Network Pruning
Training is Expensive
Research Question
Motivation and Questions
Training Pruned Networks
Iterative Magnitude Pruning
Results
The Lottery Ticket Hypothesis
Broader Questions
Larger-Scale Settings
Scalability Challenges
Linear Mode Connectivity
Instability
Rewinding IMP Works
Takeaways
Our Current Understanding
Implications and Follow-Up

Taught by

MIT Embodied Intelligence
