Overview
Watch a 54-minute lecture from the Simons Institute in which Rebecca Willett of the University of Chicago explores algorithmic stability in machine learning and data science. Dive into how bagging can stabilize virtually any prediction model, regardless of the input data, guaranteeing that predictions change only minimally when the training data is modified. Learn how stability guarantees extend beyond traditional prediction modeling to broader statistical estimation problems. Discover a new framework that combines bagging applied to class or model weights with a stable "soft" version of the argmax operator, enabling stable classification and model selection. Understand how stability serves as a fundamental principle for reliable data science, with consequences for generalization, cross-validation, and uncertainty quantification. Gain practical insight into choosing the bag size needed to achieve a desired level of stability in machine learning applications.
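To make the two ingredients above concrete, here is a minimal sketch, not taken from the lecture itself: class-weight averaging over random sub-bags, followed by a generic temperature-scaled softmax standing in for the stable "soft" argmax the talk describes. The base learner, the bag size, the bag count, and the temperature are all illustrative choices, and scikit-learn is assumed to be available.

```python
# Minimal sketch of bagging for stability, assuming scikit-learn is installed.
# soft_argmax below is a generic softmax relaxation, not the specific stable
# argmax operator or bag-size rule from the lecture.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def bagged_class_weights(X, y, x_new, n_bags=200, bag_size=None):
    """Average class-probability estimates over many random sub-bags.

    Averaging over subsamples smooths the base learner, so dropping or
    modifying one training point perturbs the averaged weights only slightly.
    Assumes labels are integers 0, ..., K-1.
    """
    n, n_classes = len(X), len(np.unique(y))
    m = bag_size or n // 2  # bag size controls the stability level (illustrative choice)
    probs = np.zeros(n_classes)
    for _ in range(n_bags):
        idx = rng.choice(n, size=m, replace=False)  # subsample without replacement
        model = DecisionTreeClassifier().fit(X[idx], y[idx])
        # Accumulate by model.classes_ in case a bag misses some classes.
        probs[model.classes_] += model.predict_proba(x_new.reshape(1, -1))[0]
    return probs / n_bags

def soft_argmax(weights, temperature=0.1):
    """Softmax relaxation of argmax: small changes in the input weights
    produce small changes in the output distribution, unlike a hard argmax."""
    z = np.asarray(weights) / temperature
    z -= z.max()  # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Example usage on synthetic data.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)
w = bagged_class_weights(X, y, x_new=np.zeros(4))
print(soft_argmax(w))  # varies smoothly as individual training points change
```

Increasing n_bags and shrinking bag_size both tighten stability, at the cost of more fits and a weaker base learner; the lecture's framework quantifies this trade-off to pick a bag size for a target stability level.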
Syllabus
Off-the-shelf Algorithmic Stability
Taught by
Simons Institute