Continual Learning and Catastrophic Forgetting

Paul Hand via YouTube

Overview

Explore continual learning and catastrophic forgetting in deep neural networks in this 42-minute lecture. Delve into the context for continual learning, evaluation methods, and algorithms based on regularization, dynamic architectures, and complementary learning systems. Examine data-permutation tasks, incremental class learning, multimodal class learning, the Learning without Forgetting algorithm, Elastic Weight Consolidation, Progressive Neural Networks, and generative replay. The lecture comes from Northeastern University's CS 7150 Deep Learning course and references key research papers in the field; accompanying lecture notes are available for a more comprehensive treatment of this crucial topic in machine learning.
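Among the regularization approaches the lecture covers is Elastic Weight Consolidation, which adds a quadratic penalty that discourages parameters important to earlier tasks from drifting. A minimal sketch of that penalty, in plain Python with illustrative names (`ewc_penalty`, `fisher`, `old_params` are not from the lecture):

```python
def ewc_penalty(params, old_params, fisher, lam=1.0):
    """EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2.

    fisher holds diagonal Fisher-information estimates, which measure
    how important each parameter was for the previously learned task.
    """
    return 0.5 * lam * sum(
        f * (p - p_old) ** 2
        for p, p_old, f in zip(params, old_params, fisher)
    )

# Parameters with a high Fisher value are penalized more strongly for
# drifting from their old values, mitigating catastrophic forgetting.
old_params = [1.0, -2.0]
fisher = [10.0, 0.1]   # the first parameter mattered much more on task A
print(ewc_penalty([1.5, -1.0], old_params, fisher, lam=2.0))
```

In training on a new task, this term would be added to the new task's loss, so gradient descent trades off new-task performance against movement of the weights that the old task relied on.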

Syllabus

Introduction
Context for continual learning
Training on new data
Catastrophic forgetting
Training from scratch
Replaying training data
Evaluating continual learning
Incremental class learning
Multimodal class learning
Strategies for continual learning
Regularization approaches
Learning without forgetting
Regularization
Elastic Weight Consolidation
Bayesian Learning Perspective
Progressive Neural Networks
Generative replay
Complementary learning systems
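The "replaying training data" strategy listed above can be sketched as interleaving stored samples from earlier tasks with new data, so the model keeps seeing old examples while it learns the new task. A minimal illustration with hypothetical names (`make_replay_batches`, `replay_buffer` are not from the lecture; generative replay would draw the old samples from a generator instead of a buffer):

```python
import random

def make_replay_batches(new_data, replay_buffer, batch_size=4, replay_frac=0.5):
    """Yield mixed batches: part new-task samples, part replayed old samples."""
    n_replay = int(batch_size * replay_frac)
    n_new = batch_size - n_replay
    random.shuffle(new_data)
    for i in range(0, len(new_data), n_new):
        batch = new_data[i:i + n_new]
        # Top the batch up with samples from previously seen tasks.
        batch += random.sample(replay_buffer, min(n_replay, len(replay_buffer)))
        yield batch

old = [("task_a", k) for k in range(10)]   # stored examples from task A
new = [("task_b", k) for k in range(8)]    # incoming examples for task B
for batch in make_replay_batches(new, old):
    print(batch)                           # each batch mixes task A and task B
```

Because every gradient step includes old-task examples, the network is continually reminded of the earlier distribution, which is the basic intuition behind both rehearsal buffers and generative replay.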

Taught by

Paul Hand

Reviews

5.0 rating, based on 1 Class Central review

Start your review of Continual Learning and Catastrophic Forgetting

  • Kemal Mudie Tosora
    Great course, which helps us understand how continual learning differs from related areas like reinforcement learning, imitation learning, federated learning, and more. It is like how we humans accumulate knowledge through time. Hopefully continual learning will push the boundaries of AGI a lot.
