

Meta-Learning - Why It’s Hard and What We Can Do - Ke Li

Institute for Advanced Study via YouTube

Overview

Explore the challenges and potential solutions in meta-learning through this comprehensive seminar on theoretical machine learning. Delve into hyperparameter optimization, automatic model selection, and program induction as Ke Li, a Member of the School of Mathematics at the Institute for Advanced Study, presents "Meta-Learning: Why It's Hard and What We Can Do." Gain insights into design considerations, proof frameworks, and optimization-based meta-learning. Examine objective functions, methods for preventing overfitting, and forward dynamics in both deterministic and stochastic settings. Understand the original and new formulations of the meta-learning problem, with practical examples and illustrations. Investigate the role of neural networks, parameter intervals, and gradients in meta-learning experiments. Learn about empirical learning techniques and strategies to improve and accelerate the learning process.
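
As a rough orientation to the optimization-based meta-learning setting the talk surveys (a minimal sketch, not material from the seminar itself; the quadratic task family, learning rates, and variable names below are illustrative assumptions), here is a MAML-style toy example in which a shared initialization is learned so that a few inner gradient steps adapt it to each new task:

```python
# Minimal sketch of optimization-based meta-learning (learn an initialization).
# Illustrative only; the task setup is an assumption, not taken from the talk.
import numpy as np

rng = np.random.default_rng(0)

def task_loss(theta, target):
    # Each "task" asks the parameter theta to match a different target value.
    return 0.5 * (theta - target) ** 2

def task_grad(theta, target):
    return theta - target

inner_lr, outer_lr, inner_steps = 0.1, 0.05, 3
theta = 0.0  # meta-parameter: the shared initialization

for meta_iter in range(500):
    meta_grad = 0.0
    targets = rng.normal(loc=2.0, scale=1.0, size=4)  # sample a batch of tasks
    for target in targets:
        # Inner loop: adapt from the shared initialization with a few gradient steps.
        phi = theta
        for _ in range(inner_steps):
            phi = phi - inner_lr * task_grad(phi, target)
        # Outer gradient dL(phi)/dtheta: for this linear update rule,
        # dphi/dtheta = (1 - inner_lr)^inner_steps, so the chain rule gives:
        dphi_dtheta = (1.0 - inner_lr) ** inner_steps
        meta_grad += task_grad(phi, target) * dphi_dtheta
    theta -= outer_lr * meta_grad / len(targets)

print(f"learned initialization: {theta:.3f} (tasks are centered near 2.0)")
```

Hyperparameter optimization follows the same two-level pattern: an inner loop trains the model while an outer loop differentiates through, or searches over, the hyperparameters.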

Syllabus

Introduction
Hyperparameter Optimization
Automatic Model Selection
Program Induction
Design Considerations
Proof Framework
Optimization-Based Meta-Learning
Objective Functions
Preventing Overfitting
Forward Dynamics
Uncertainty
Forward Dynamic Stochastic
Original Formulation
New Formulation
Expectation
Setting
Example
Quick Question
Update Formula
Neural Nets
Experiments
Parameters
Intervals
Gradients
Illustration
Outputs
Empirical Learning
Correct Yourself
Speed Up

Taught by

Institute for Advanced Study

