Explore the foundations of statistical learning theory in this comprehensive lecture from the Modern Paradigms in Generalization Boot Camp. Delve into classical 20th-century concepts, focusing on generalization through capacity control. Examine the Vapnik–Chervonenkis fundamental theorem of learning, scale-sensitive capacity control and margins, and the Minimum Description Length principle. Investigate parallels with stochastic optimization and explore generalization from an optimization perspective, including online-to-batch conversion, stochastic approximation, and boosting. Analyze how the classic theory relates to current interests such as interpolation learning, benign overfitting, and implicit bias. Gain valuable insights from Nati Srebro of the Toyota Technological Institute at Chicago in this 1-hour 17-minute presentation hosted by the Simons Institute.