Overview
Explore continual learning and catastrophic forgetting in deep neural networks through this 42-minute lecture. Delve into the context, evaluation methods, and algorithms based on regularization, dynamic architectures, and Complementary Learning Systems. Examine data permutation tasks, incremental task learning, multimodal learning, Learning without Forgetting algorithm, Elastic Weight Consolidation, Progressive Neural Networks, and Generative replay. Gain insights from Northeastern University's CS 7150 Deep Learning course, with references to key research papers in the field. Access accompanying lecture notes for a comprehensive understanding of this crucial topic in machine learning.
Syllabus
Introduction
Context for continual learning
Training on new data
Catastrophic forgetting
Training from scratch
Replaying training data
Evaluating continual learning
Incremental class learning
Multimodal class learning
Strategies for continual learning
Regularization approaches
Learning without forgetting
Regularization
Elastic Weight Consolidation
Bayesian Learning Perspective
Progressive Neural Networks
Generative replay
Complementary learning systems
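To make the regularization idea in the syllabus concrete: Elastic Weight Consolidation (Kirkpatrick et al., 2017) anchors each parameter to its value after the previous task, weighted by that parameter's estimated importance (the diagonal of the Fisher information). A minimal sketch of the penalty term follows; the function name `ewc_penalty` and the strength parameter `lam` are illustrative, not from the lecture.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC quadratic penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    theta      -- current parameters while training on the new task
    theta_star -- parameters learned on the previous task (the anchor)
    fisher     -- diagonal Fisher information estimate for the previous task
    lam        -- illustrative hyperparameter trading old vs. new task
    """
    theta = np.asarray(theta, dtype=float)
    theta_star = np.asarray(theta_star, dtype=float)
    fisher = np.asarray(fisher, dtype=float)
    # Parameters that were important for the old task (high Fisher value)
    # are penalized heavily for drifting; unimportant ones move freely.
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

# Only the first parameter drifted, and only it carries a penalty:
# 0.5 * 2.0 * (4.0 * (1.0 - 0.0)**2 + 10.0 * (2.0 - 2.0)**2) = 4.0
print(ewc_penalty([1.0, 2.0], [0.0, 2.0], fisher=[4.0, 10.0], lam=2.0))
```

This penalty is added to the new task's loss, which is what distinguishes regularization approaches like EWC from replay- or architecture-based strategies in the syllabus above.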
Taught by
Paul Hand
Reviews
5.0 rating, based on 1 Class Central review
Great course, which helps us understand how continual learning differs from areas like reinforcement learning, imitation learning, federated learning, and more. It is like how we humans accumulate knowledge through time. Hopefully continual learning will push the boundaries of AGI a lot.