Reconciling Modern Machine Learning and the Bias-Variance Trade-Off

Yannic Kilcher via YouTube

Overview

Explore a thought-provoking video lecture that challenges the traditional understanding of generalization in machine learning. Delve into the concept of the "double descent" risk curve, which extends beyond the classic bias-variance trade-off. Discover how highly complex models like deep neural networks can achieve good out-of-sample accuracy despite having nearly zero training error. Learn about the "interpolation threshold" and its significance in modern machine learning practices. Examine the implications of this new perspective on various models, including neural networks and random forests. Gain insights into the mechanisms behind this phenomenon and its potential impact on the field of machine learning.
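The double-descent behavior described above can be sketched with a toy random-Fourier-features regression, echoing the "Random Fourier Features" experiment covered in the lecture. Everything here is illustrative (the data, widths, and function names are assumptions, not taken from the video): as the number of random features grows past the number of training points, the minimum-norm least-squares fit interpolates the training data, yet its test error can remain well behaved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem (synthetic; purely illustrative).
n_train, n_test = 20, 200
x_train = rng.uniform(-1, 1, n_train)
x_test = rng.uniform(-1, 1, n_test)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(n_train)
y_test = np.sin(2 * np.pi * x_test)

def rff(x, omegas, phases):
    """Random Fourier feature map: cos(omega * x + phase) per feature."""
    return np.cos(np.outer(x, omegas) + phases)

def risk_at_width(n_features):
    """Train and test MSE of the min-norm least-squares fit at a given width."""
    omegas = rng.normal(0.0, 8.0, n_features)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_features)
    phi_train = rff(x_train, omegas, phases)
    phi_test = rff(x_test, omegas, phases)
    # np.linalg.lstsq returns the minimum-norm solution when the system is
    # underdetermined (n_features > n_train) -- the interpolating predictor
    # at the heart of the double-descent picture.
    w, *_ = np.linalg.lstsq(phi_train, y_train, rcond=None)
    train_mse = np.mean((phi_train @ w - y_train) ** 2)
    test_mse = np.mean((phi_test @ w - y_test) ** 2)
    return train_mse, test_mse

# Sweep model width through the interpolation threshold (n_features == n_train).
for width in (5, 10, 20, 40, 200, 1000):
    tr, te = risk_at_width(width)
    print(f"width={width:5d}  train MSE={tr:.2e}  test MSE={te:.2e}")
```

Past the interpolation threshold (here, 20 features) the training error drops to essentially zero while the test error typically spikes near the threshold and then falls again as width grows, which is the "double descent" shape.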

Syllabus

Introduction
Example
Overfitting
Interpolation Threshold
Random Fourier Features
Conclusion

Taught by

Yannic Kilcher

