On the Variance and Admissibility of Empirical Risk Minimization on Convex Classes
Hausdorff Center for Mathematics via YouTube
Overview
Explore the concept of empirical risk minimization (ERM) for estimating an unknown function from noisy samples in this 51-minute lecture by Eli Putterman at the Hausdorff Center for Mathematics. Delve into the challenge of minimizing the expected error when the function to be estimated belongs to a known class. Examine why ERM, despite its intuitive appeal, can be minimax suboptimal for certain function classes. Discover recent findings showing that, under mild assumptions, the variance of ERM on convex classes is always minimax optimal, implying that any suboptimality must stem from the bias term. Learn about the proof technique, which relies on concentration of measure for Lipschitz functions on Gauss space. If time allows, gain insights into how these results yield a new proof of Chatterjee's theorem on the admissibility of ERM as an estimator. This talk, based on joint work with Gil Kur and Alexander Rakhlin, provides all necessary statistical background for a comprehensive understanding of the topic.
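For orientation, here is a minimal sketch of the setup in assumed notation (squared loss and the standard bias-variance split; not necessarily the exact formulation used in the talk). Given noisy observations Y_i = f*(X_i) + xi_i with f* in a known convex class F, ERM and the decomposition of its risk read:

\[
  \hat f_n \in \operatorname*{arg\,min}_{f \in \mathcal F} \; \frac{1}{n} \sum_{i=1}^{n} \bigl(Y_i - f(X_i)\bigr)^2
  \qquad \text{(empirical risk minimization)}
\]
\[
  \mathbb{E}\,\bigl\|\hat f_n - f^*\bigr\|^2
  \;=\; \underbrace{\bigl\|\mathbb{E}\hat f_n - f^*\bigr\|^2}_{\text{bias}^2}
  \;+\; \underbrace{\mathbb{E}\,\bigl\|\hat f_n - \mathbb{E}\hat f_n\bigr\|^2}_{\text{variance}}
\]

The result described above constrains the second (variance) term to be of minimax order, so if ERM fails to attain the minimax rate on a convex class, the gap must come from the bias term.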
Syllabus
Eli Putterman: On the variance and admissibility of empirical risk minimization on convex classes
Taught by
Hausdorff Center for Mathematics