
Random Matrices and Dynamics of Optimization in Very High Dimensions - Lecture 3

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Overview

Explore the dynamics of optimization in high-dimensional spaces through this comprehensive lecture by Gérard Ben Arous. Delve into the world of machine learning and data science algorithms, focusing on the effectiveness of simple tools like Stochastic Gradient Descent in complex, over-parametrized regimes. Gain insights into the framework of typical tasks and neural network structures used in standard contexts. Examine the classical context of SGD in finite dimensions before surveying recent work on projected "effective dynamics" for summary statistics in smaller dimensions. Discover how these dynamics govern the performance of high-dimensional systems and define complex dynamical systems in finite dimensions. Investigate the process of finding summary statistics through a dynamical spectral transition in Random Matrix Theory, exploring the behavior of Gram and Hessian matrices along optimization paths. Apply these concepts to central examples in machine learning, including multilayer neural networks for classification of Gaussian mixtures and XOR examples.

Syllabus

Gérard Ben Arous - 3/4 Random Matrices and Dynamics of Optimization in Very High Dimensions

Taught by

Institut des Hautes Etudes Scientifiques (IHES)

