
Surprising Phenomena of Max-LP-Margin Classifiers in High Dimensions

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore a 53-minute conference talk by Fanny Yang of ETH Zurich on "Surprising phenomena of max-lp-margin classifiers in high dimensions," presented at IPAM's Theory and Practice of Deep Learning Workshop. Delve into the analysis of max-lp-margin classifiers, which arise as the implicit bias of first-order methods and connect to harmless interpolation in neural networks. Discover unexpected findings in the noiseless case: minimizing the l1-norm achieves optimal rates for regression with hard-sparse ground truths, yet this adaptivity does not carry over directly to max-l1-margin classification. Investigate how, with noisy observations, max-lp-margin classifiers can achieve 1/√n rates for p slightly larger than one, whereas the maximum-l1-margin classifier only achieves rates of order 1/√(log(d/n)). Gain insight into current research in machine learning theory and its implications for deep learning practice.
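To make the central object of the talk concrete, here is a minimal, illustrative sketch of a max-lp-margin classifier on toy separable data. It is not the analysis from the talk (which concerns the high-dimensional regime d >> n); it simply solves the standard equivalent formulation — minimize the lp-norm subject to all margins being at least one — using scipy's SLSQP solver, with variable names (`max_lp_margin`, `w_star`) chosen here for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def max_lp_margin(X, y, p=1.5, w0=None):
    """Sketch: solve  min ||w||_p^p  s.t.  y_i <x_i, w> >= 1  for all i.

    For linearly separable data this minimizer points, up to rescaling,
    in the max-lp-margin direction. Illustrative only, not the talk's
    high-dimensional (d >> n) analysis.
    """
    n, d = X.shape
    if w0 is None:
        w0 = np.ones(d)
    objective = lambda w: np.sum(np.abs(w) ** p)  # differentiable for p > 1
    constraints = [
        {"type": "ineq", "fun": lambda w, i=i: y[i] * (X[i] @ w) - 1.0}
        for i in range(n)
    ]
    res = minimize(objective, w0, method="SLSQP", constraints=constraints)
    return res.x

# Toy separable data (small d for illustration, unlike the d >> n setting).
rng = np.random.default_rng(0)
n, d = 20, 5
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
y = np.sign(X @ w_star)

# Feasible warm start: rescale w_star so every margin is at least 1.
w0 = w_star / (y * (X @ w_star)).min()
w_hat = max_lp_margin(X, y, p=1.5, w0=w0)
print("min margin:", (y * (X @ w_hat)).min())  # ≈ 1 (active constraints)
```

Taking p slightly above one, as here, mirrors the talk's observation that the regime p > 1 behaves qualitatively differently from the pure l1 case.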

Syllabus

Fanny Yang - Surprising phenomena of max-lp-margin classifiers in high dimensions - IPAM at UCLA

Taught by

Institute for Pure & Applied Mathematics (IPAM)

