
Implicit Regularization via Uniform Convergence in Linear Models and Kernel Methods

Harvard CMSA via YouTube

Overview

Watch a 55-minute lecture from Harvard CMSA's GRAMSIA series in which Patrick Rebeschini (University of Oxford) explores the relationship between uniform convergence and implicit regularization in learning algorithms. Delve into the statistical analysis of early-stopped mirror descent applied to the unregularized empirical risk with squared loss in linear models and kernel methods. Discover how the potential-based analysis of mirror descent from optimization theory connects to uniform learning, and learn how the path traced by the mirror descent iterates can be characterized through localized Rademacher complexities. Understand how these complexities depend on the choice of mirror map, the initialization point, the step size, and the number of iterations.
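To make the setup concrete, below is a minimal, self-contained sketch (not taken from the lecture) of early-stopped mirror descent on the unregularized empirical squared-loss risk for a linear model. The mirror map is a coordinate-wise hypentropy, chosen for illustration only because its gradient and inverse gradient have closed forms; the function name, the validation-based stopping rule, the synthetic data, and all parameter values are assumptions, not anything prescribed in the talk.

    # A sketch of early-stopped mirror descent on the unregularized empirical
    # risk with squared loss, L(w) = ||Xw - y||^2 / (2n), for a linear model.
    # Mirror map (illustrative assumption): coordinate-wise hypentropy
    #   psi(w) = sum_i [ w_i * asinh(w_i / beta) - sqrt(w_i^2 + beta^2) ],
    # so grad psi(w) = asinh(w / beta) and its inverse is w = beta * sinh(theta).
    import numpy as np

    def mirror_descent_early_stop(X, y, X_val, y_val,
                                  beta=0.01, step=0.1, max_iters=5000):
        n, d = X.shape
        theta = np.zeros(d)                   # dual iterate; w_0 = beta*sinh(0) = 0
        best_w, best_val = np.zeros(d), np.inf
        for _ in range(max_iters):
            w = beta * np.sinh(theta)         # inverse mirror map: dual -> primal
            val_err = np.mean((X_val @ w - y_val) ** 2)
            if val_err < best_val:            # early stopping: keep the best iterate
                best_val, best_w = val_err, w
            grad = X.T @ (X @ w - y) / n      # gradient of the squared loss
            theta -= step * grad              # mirror step in dual coordinates
        return best_w, best_val

    # Usage on synthetic sparse data (hypothetical example): a small beta biases
    # the early-stopped path toward sparse solutions, one instance of the
    # implicit regularization the lecture analyzes.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))
    w_true = np.zeros(50)
    w_true[:3] = 1.0
    y = X @ w_true + 0.1 * rng.standard_normal(100)
    w_hat, err = mirror_descent_early_stop(X[:80], y[:80], X[80:], y[80:])
    print(f"held-out MSE at early-stopped iterate: {err:.4f}")

With the squared-norm mirror map psi(w) = ||w||^2 / 2, the same loop reduces to early-stopped gradient descent; varying the mirror map, the initialization, the step size, and the number of iterations changes which solution the path is implicitly biased toward, matching the factors highlighted in the lecture.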

Syllabus

Patrick Rebeschini | Implicit regularization via uniform convergence

Taught by

Harvard CMSA

