Overview
Explore the fundamental concepts of generalization error and stability in statistical learning theory in this lecture by Lorenzo Rosasco of MIT, the University of Genoa, and IIT. Topics include excess risk, universal consistency, empirical risk minimization, the law of large numbers, and the union bound. See how the analysis is rewritten to bring out the crucial role that stability plays in machine learning algorithms. The lecture is part of MIT's 9.520/6.860S Statistical Learning Theory and Applications course and is aimed at students and professionals who want to deepen their understanding of advanced machine learning concepts.
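For orientation, here is a minimal sketch of the quantities named above in standard statistical-learning notation; the symbols $\ell$, $L$, $\widehat{L}$, and $\mathcal{H}$ are generic conventions assumed for illustration, not necessarily the lecture's own.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Standard statistical-learning quantities (generic notation, assumed for illustration):
% data (x_i, y_i), i = 1, ..., n, drawn i.i.d. from an unknown distribution rho.
\begin{align*}
  % Expected (true) risk of a predictor f under a loss function \ell:
  L(f) &= \mathbb{E}_{(x,y)\sim\rho}\bigl[\ell(f(x),y)\bigr] \\
  % Empirical risk on the n training samples:
  \widehat{L}(f) &= \frac{1}{n}\sum_{i=1}^{n} \ell(f(x_i),y_i) \\
  % Empirical risk minimization (ERM) over a hypothesis class \mathcal{H}:
  \widehat{f} &\in \operatorname*{arg\,min}_{f\in\mathcal{H}} \widehat{L}(f) \\
  % Excess risk: the gap between ERM's risk and the best achievable risk.
  % Universal consistency means this gap vanishes as n grows, for every rho.
  \text{excess risk of } \widehat{f} &= L(\widehat{f}\,) - \inf_{f} L(f)
\end{align*}
\end{document}
```

In the standard development, the law of large numbers and a union bound relate empirical and expected risk, while stability measures how sensitive the learned predictor is to replacing a single training point.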
Syllabus
Recap
Excess Risk
Universal Consistency
Empirical Risk Minimization
Law of Large Numbers
Union Bound
Stability
Rewriting
Taught by
MITCBMM