
Flexible Neural Networks and the Frontiers of Meta-Learning

Simons Institute via YouTube

Overview

Explore the cutting-edge concepts of flexible neural networks and meta-learning in this 50-minute lecture by Chelsea Finn from Stanford University. Delve into the challenges of enabling agents to learn skills in the real world, focusing on few-shot image classification as a key example. Examine the meta-learning problem from both mechanistic and probabilistic perspectives, and understand how supervised learning relates to few-shot learning. Discover optimization-based inference techniques and learn how to leverage data from previous objects for quick adaptation to new ones. Gain insights into the practical instantiation of FTML (Follow the Meta Leader) for online meta-learning and analyze experimental results in this thought-provoking talk from the Simons Institute's "Emerging Challenges in Deep Learning" series.
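
The "optimization-based inference" and few-shot adaptation ideas mentioned above are in the spirit of MAML-style meta-learning: learn an initialization that a few inner-loop gradient steps can adapt to a new task. The toy sketch below illustrates that structure on a synthetic sine-regression family; the feature model, task distribution, step counts, and learning rates are illustrative assumptions, not the setup used in the lecture.

import numpy as np

rng = np.random.default_rng(0)

# Fixed random Fourier-style features keep the model linear in its parameters,
# so gradients stay simple and the sketch stays short.
FREQS = rng.uniform(0.5, 2.0, size=20)
PHASES = rng.uniform(0.0, np.pi, size=20)

def features(x):
    return np.cos(np.outer(x, FREQS) + PHASES)

def sample_task():
    # A task is a sine wave with random amplitude and phase.
    amplitude = rng.uniform(0.5, 2.0)
    phase = rng.uniform(0.0, np.pi)
    return lambda x: amplitude * np.sin(x + phase)

def loss_and_grad(params, x, y):
    # Mean-squared-error loss and its gradient for the linear feature model.
    phi = features(x)
    err = phi @ params - y
    return np.mean(err ** 2), 2.0 * phi.T @ err / len(x)

def inner_adapt(params, x, y, lr=0.1, steps=1):
    # Inner loop: adapt the shared initialization to one task with a few gradient steps.
    adapted = params.copy()
    for _ in range(steps):
        _, g = loss_and_grad(adapted, x, y)
        adapted = adapted - lr * g
    return adapted

# Outer loop: meta-train the initialization across many sampled tasks
# (first-order approximation of the MAML meta-gradient).
meta_params = np.zeros(20)
meta_lr = 0.01
for step in range(2000):
    task = sample_task()
    x_support = rng.uniform(-5, 5, size=5)    # few-shot "support" examples
    x_query = rng.uniform(-5, 5, size=20)     # "query" examples for the meta-loss
    adapted = inner_adapt(meta_params, x_support, task(x_support))
    _, g_query = loss_and_grad(adapted, x_query, task(x_query))
    meta_params = meta_params - meta_lr * g_query

FTML ("Follow the Meta Leader") applies this idea in an online setting where tasks arrive one at a time: after each round, the meta-parameters are re-trained on all tasks seen so far and then quickly adapted to the newest task. Below is a rough sketch reusing the toy helpers above; it is an assumption-laden illustration of the outer loop, not the implementation discussed in the lecture.

# Tasks arrive sequentially; "follow the meta leader" re-runs the meta-update
# over the whole buffer of observed tasks after each round.
task_buffer = []
ftml_params = np.zeros(20)
for round_t in range(50):
    task_buffer.append(sample_task())          # a new task arrives this round
    for _ in range(200):                       # meta-update over all past tasks
        task = task_buffer[rng.integers(len(task_buffer))]
        x_s = rng.uniform(-5, 5, size=5)
        x_q = rng.uniform(-5, 5, size=20)
        adapted = inner_adapt(ftml_params, x_s, task(x_s))
        _, g_q = loss_and_grad(adapted, x_q, task(x_q))
        ftml_params = ftml_params - 0.01 * g_q
    # Deploy on the current task: quick adaptation from a small support set.
    current = task_buffer[-1]
    x_new = rng.uniform(-5, 5, size=5)
    params_for_current = inner_adapt(ftml_params, x_new, current(x_new))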

Syllabus

Intro
How can we enable agents to learn skills in the real world?
Example: Few-Shot Image Classification
The Meta-Learning Problem: The Mechanistic View
The Meta-Learning Problem: The Probabilistic View
Supervised Learning
Few-Shot Learning
Optimization-Based Inference
Leverage data from previous objects to quickly adapt to new ones?
Practical instantiation of FTML
Experiments

Taught by

Simons Institute
