From Classical Statistics to Modern Machine Learning

Simons Institute via YouTube

Overview

Explore the evolution from classical statistics to modern machine learning in this 50-minute lecture by Mikhail Belkin of The Ohio State University. Delve into supervised ML, generalization bounds, and the classical U-shaped generalization curve. Examine interpolation in deep learning, asking whether it leads to overfitting and why it remains effective even on very noisy data. Investigate the "double descent" risk curve, the mechanism behind it, and its implications for linear regression. Analyze the landscape of generalization, optimization under interpolation, and the power of interpolation in modern ML. Gain insight into fast and effective kernel machines inspired by deep learning, and review the key points in the transition from classical statistical approaches to contemporary machine learning.
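To make the lecture's central idea concrete, the following is a minimal sketch of the "double descent" risk curve for linear regression over random features, the setting the syllabus covers under "Double descent in linear regression". All specifics here (random ReLU features, the sine target, the sample sizes, and the noise level) are illustrative assumptions rather than details taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a smooth 1-D target observed with label noise.
n_train, n_test, noise = 20, 200, 0.3
x_tr = rng.uniform(-1.0, 1.0, n_train)
x_te = rng.uniform(-1.0, 1.0, n_test)
target = lambda x: np.sin(2.0 * np.pi * x)
y_tr = target(x_tr) + noise * rng.standard_normal(n_train)
y_te = target(x_te)

# A nested family of random ReLU features: phi_j(x) = max(0, w_j * x + b_j).
p_max = 200
w = rng.standard_normal(p_max)
b = rng.uniform(-1.0, 1.0, p_max)

def features(x, p):
    return np.maximum(0.0, np.outer(x, w[:p]) + b[:p])

for p in (2, 5, 10, 15, 20, 25, 40, 100, 200):
    # Minimum-norm least squares via the pseudoinverse; once p exceeds
    # n_train, this solution interpolates the (noisy) training labels.
    coef = np.linalg.pinv(features(x_tr, p)) @ y_tr
    mse = np.mean((features(x_te, p) @ coef - y_te) ** 2)
    print(f"p = {p:4d}   test MSE = {mse:.3f}")
```

In runs of this kind, the test error typically rises toward a peak near the interpolation threshold (p roughly equal to n_train, 20 here) and then descends again as p grows, which is the double descent shape the lecture analyzes.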

Syllabus

Intro
Supervised ML
Generalization bounds
Classical U-shaped generalization curve
Does interpolation overfit?
Interpolation does not overfit even for very noisy data
Deep learning practice
Generalization theory for interpolation?
A way forward?
Interpolated k-NN schemes
Interpolation and adversarial examples
"Double descent" risk curve
What is the mechanism?
Double descent in linear regression
Occam's razor
The landscape of generalization
Where is the interpolation threshold?
Optimization under interpolation
SGD under interpolation
The power of interpolation
Learning from deep learning: fast and effective kernel machines (see the sketch after this syllabus)
Important points
From classical statistics to modern ML
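
As a companion to the syllabus item on kernel machines, here is a minimal sketch of an interpolating ("ridgeless") kernel regressor, the basic object behind fast kernel methods of the kind the lecture draws from deep learning. The Gaussian kernel, its bandwidth, the toy data, and the small jitter term are all assumptions made for illustration; the talk's own solvers are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: noisy samples of a 1-D target.
n = 50
x_tr = rng.uniform(-1.0, 1.0, (n, 1))
y_tr = np.sin(3.0 * x_tr[:, 0]) + 0.1 * rng.standard_normal(n)

def gaussian_kernel(A, B, bandwidth=0.3):
    # k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

# Interpolating kernel machine: solve K @ alpha = y with (essentially) no
# ridge term, so the fitted function passes through every training label.
K = gaussian_kernel(x_tr, x_tr)
alpha = np.linalg.solve(K + 1e-10 * np.eye(n), y_tr)  # tiny jitter for stability

x_te = np.linspace(-1.0, 1.0, 5)[:, None]
print(gaussian_kernel(x_te, x_tr) @ alpha)
```

Driving the training error to zero this way puts the model in exactly the interpolation regime whose generalization the lecture examines, and kernel machines make that regime tractable to study analytically.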

Taught by

Simons Institute
