Overview
Explore a comprehensive lecture on classical statistical decision theory and its application to modern machine learning paradigms. Delve into the relationship between prediction error, generalization gap, and model complexity from the fixed-X perspective traditional in statistics. Examine the insights this perspective offers and its limitations when applied to the random-X settings common in machine learning. Discover how classical statistical concepts can be reinterpreted and extended to address the challenges posed by flexible models that interpolate the training data. Gain valuable knowledge on bridging the gap between traditional statistical methods and contemporary machine learning approaches in this 1-hour 18-minute talk by Ryan Tibshirani of the University of California, Berkeley, presented at the Simons Institute's Modern Paradigms in Generalization Boot Camp.
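To make the fixed-X versus random-X distinction concrete, here is a minimal sketch of the standard definitions; these are assumed textbook formulations, not notation quoted from the talk itself. Training data (x_i, y_i), i = 1, ..., n, are modeled as y_i = f(x_i) + epsilon_i, and \hat{f} denotes the model fit to that data.

% Fixed-X prediction error (a sketch under the assumptions above): the covariates
% x_i are held fixed and only fresh responses \tilde{y}_i = f(x_i) + \tilde{\epsilon}_i
% are redrawn at the same points.
\[
  \mathrm{Err}_{\mathrm{fixed}}
    = \frac{1}{n} \sum_{i=1}^{n}
      \mathbb{E}\!\left[ \bigl(\tilde{y}_i - \hat{f}(x_i)\bigr)^2 \right].
\]

% Random-X prediction error: a new covariate-response pair is drawn from the
% joint distribution P, as is typical in machine learning evaluation.
\[
  \mathrm{Err}_{\mathrm{random}}
    = \mathbb{E}\!\left[ \bigl(y_0 - \hat{f}(x_0)\bigr)^2 \right],
  \qquad (x_0, y_0) \sim P.
\]

Roughly speaking, the fixed-X view ties the gap between training error and prediction error (the optimism, one form of generalization gap) to model complexity through covariance-based degrees of freedom, while the random-X view introduces additional terms from evaluating at new covariate values, which is one reason the classical theory needs revisiting for flexible, interpolating models.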
Syllabus
Prediction, Generalization, Complexity: Revisiting the Classical View from Statistics Part 1
Taught by
Simons Institute