
Learning and Generalization in Over-parametrized Neural Networks - Going Beyond Kernels

Simons Institute via YouTube

Overview

Explore the frontiers of deep learning in this 50-minute lecture by Yuanzhi Li of Stanford University. Delve into learning and generalization in over-parametrized neural networks, going beyond traditional kernel methods. Examine key concepts such as tangent kernels, concept classes, learning algorithms, and generalization theorems. Gain insight into why neural networks are effective and into the intuitions behind their complexity, and discover potential future directions for the field. This talk, part of the Frontiers of Deep Learning series at the Simons Institute, offers a comprehensive overview of cutting-edge research in neural network theory.

Syllabus

Introduction
Tangent Kernel
Theorem
Intuition
Takeaway
Why Neural Networks
Results
Learning Network
Concept Class
Learning Algorithms
Generalization Theorem
Complexity
Extensions
Intuitions
Higher Level Intuition
Key Message
Feed Better
Neural Network
Future Direction

Taught by

Simons Institute

