Overview
Explore an in-depth analysis of the research paper "Supermasks in Superposition" in this comprehensive video lecture. Delve into supermasks: binary masks that, when applied to a fixed, randomly initialized neural network, select a subnetwork that performs well on a specific task, and see how they are applied to lifelong learning. Learn how the system can automatically infer the task ID at inference time and distinguish up to 2,500 tasks. Follow along as the lecture covers key topics including catastrophic forgetting, mask superpositions, binary maximum entropy search, and encoding masks in Hopfield networks. Gain insights into the paper's methodology, experiments, and conclusions, as well as potential applications and extensions of this approach to sequential learning in neural networks.
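To make the central mechanism concrete, below is a minimal sketch of the entropy-based task inference the lecture walks through: the stored supermasks are superimposed with learnable coefficients, and the task whose coefficient most reduces the output entropy is selected. The single linear layer, the function name infer_task, and the uniform coefficient initialization are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def infer_task(weights, supermasks, x):
    """Guess which task a batch x comes from (one-shot entropy criterion).

    weights:    fixed, randomly initialized weight matrix W (out x in);
                a single linear layer stands in for the full network here.
    supermasks: list of k binary masks, each the same shape as W.
    x:          batch of inputs from an unknown task (batch x in).
    """
    k = len(supermasks)
    # One coefficient per stored task, initialized uniformly.
    alpha = torch.full((k,), 1.0 / k, requires_grad=True)
    # Superposition of the masked networks: W_eff = W * sum_i alpha_i * M_i
    mixed_mask = sum(a * m for a, m in zip(alpha, supermasks))
    logits = x @ (weights * mixed_mask).t()
    probs = F.softmax(logits, dim=-1)
    # Mean entropy of the superimposed network's output distribution.
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1).mean()
    entropy.backward()
    # The mask whose coefficient would most decrease the entropy
    # (most negative gradient) is the best guess for the task.
    return int(torch.argmin(alpha.grad))

# Toy usage with random weights and masks:
W = torch.randn(10, 784)
masks = [(torch.rand_like(W) > 0.5).float() for _ in range(5)]
task_id = infer_task(W, masks, torch.randn(32, 784))
```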
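The lecture also discusses replacing the explicit list of masks with a single Hopfield network that stores them all. The sketch below uses the classical Hebbian outer-product storage rule and synchronous sign updates as a stand-in; the paper's actual construction differs in detail.

```python
import numpy as np

def store_masks(masks):
    """Store flattened binary {0,1} masks as Hopfield attractors (Hebbian rule)."""
    n = masks[0].size
    W = np.zeros((n, n))
    for m in masks:
        s = 2.0 * m.reshape(-1) - 1.0  # map {0,1} -> {-1,+1}
        W += np.outer(s, s)
    np.fill_diagonal(W, 0)             # no self-connections
    return W / len(masks)

def recall_mask(W, noisy_mask, steps=20):
    """Recover a stored mask from a corrupted probe by iterating the dynamics."""
    s = 2.0 * noisy_mask.reshape(-1) - 1.0
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0                # break ties deterministically
    return ((s + 1) / 2).astype(int)
```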
Syllabus
- Intro & Overview
- Catastrophic Forgetting
- Supermasks
- Lifelong Learning using Supermasks
- Inference Time Task Discrimination by Entropy
- Mask Superpositions
- Proof-of-Concept, Task Given at Inference
- Binary Maximum Entropy Search
- Task Not Given at Inference
- Task Not Given at Training
- Ablations
- Superfluous Neurons
- Task Selection by Detecting Outliers
- Encoding Masks in Hopfield Networks
- Conclusion
Taught by
Yannic Kilcher