Random Matrices and Dynamics of Optimization in Very High Dimensions - Lecture 1
Institut des Hautes Etudes Scientifiques (IHES) via YouTube
Overview
Explore the dynamics of optimization in high-dimensional spaces through this comprehensive lecture on random matrices and machine learning algorithms. Delve into the effectiveness of simple tools like Stochastic Gradient Descent in complex, over-parameterized regimes. Learn about the framework for non-experts, covering the structure of typical tasks and neural networks used in standard contexts.

Discover recent research on projected "effective dynamics" for summary statistics in smaller dimensions and their impact on high-dimensional system performance. Examine the critical regime and its role in defining finite-dimensional dynamical systems that govern learning algorithm performance. Investigate how systems identify these summary statistics through a dynamical spectral transition in Random Matrix Theory, and gain insights into the behavior of Gram and Hessian matrices along optimization paths, including the development of outliers carrying the effective dynamics.

Review essential Random Matrix Theory tools, focusing on behavior at the edge of the spectrum and the BBP transition. Apply these concepts to central machine learning examples, such as multilayer neural networks for classification of Gaussian mixtures and XOR problems.
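The BBP transition mentioned above can be illustrated numerically: when a Wigner matrix is perturbed by a rank-one "signal" of strength theta, the top eigenvalue separates from the bulk spectrum (edge at 2) only once theta exceeds 1, landing near theta + 1/theta. The following minimal numpy sketch (not from the lecture; matrix size and signal direction are illustrative choices) demonstrates this outlier phenomenon:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # matrix dimension (finite-n proxy for the high-dimensional limit)

def top_eigenvalue(theta, n, rng):
    # Symmetric Gaussian (Wigner) matrix scaled so the bulk spectrum fills [-2, 2]
    A = rng.normal(size=(n, n))
    W = (A + A.T) / np.sqrt(2 * n)
    # Rank-one spike in a fixed unit direction v
    v = np.ones(n) / np.sqrt(n)
    M = W + theta * np.outer(v, v)
    return np.linalg.eigvalsh(M)[-1]

# Below the threshold (theta <= 1) the top eigenvalue sticks to the bulk
# edge at 2; above it, an outlier emerges near theta + 1/theta.
for theta in [0.5, 2.0]:
    print(f"theta = {theta}: top eigenvalue = {top_eigenvalue(theta, n, rng):.3f}")
```

For theta = 2.0 the printed top eigenvalue is close to 2.5 = theta + 1/theta, while for theta = 0.5 it stays near the bulk edge at 2, matching the transition described in the lecture overview.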
Syllabus
Gérard Ben Arous - 1/4 Random Matrices and Dynamics of Optimization in Very High Dimensions
Taught by
Institut des Hautes Etudes Scientifiques (IHES)