Is Optimization the Right Language to Understand Deep Learning? - Sanjeev Arora
Institute for Advanced Study via YouTube
Overview
Explore a thought-provoking lecture on deep learning theory delivered by Princeton University's Sanjeev Arora at the Institute for Advanced Study. Delve into the question of whether optimization is the most appropriate framework for understanding deep learning. Examine key concepts including generalization, first-order optimization, and the Neural Tangent Kernel (NTK). Investigate the training of infinitely wide deep nets, kernel linear regression, and matrix completion. Analyze deep linear networks, learning rates, and formal statements related to connectivity. Gain insights into the current state and future directions of deep learning theory through this comprehensive exploration of optimization's role in understanding neural networks.
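The lecture itself contains no code, but the kernel viewpoint it discusses can be made concrete: in the NTK regime, training an infinitely wide net by gradient descent behaves like kernel regression with a fixed kernel. Below is a minimal kernel ridge regression sketch; it uses a generic RBF kernel as a stand-in for the NTK, and all data, hyperparameters, and names are illustrative assumptions rather than anything taken from the talk.

```python
import numpy as np

# Minimal kernel ridge regression: with a fixed kernel K, the trained
# predictor is f(x) = sum_i alpha_i k(x, x_i), where alpha solves a
# linear system in the kernel matrix. (RBF kernel stands in for the NTK.)

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances between rows of A and rows of B.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                       # training inputs (illustrative)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)    # noisy targets

lam = 1e-3                                          # ridge regularization
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)  # dual coefficients

X_test = rng.normal(size=(5, 3))
y_pred = rbf_kernel(X_test, X) @ alpha              # kernel predictor at test points
print(y_pred)
```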
Syllabus
Intro
What is optimization?
Generalization
First Order Optimization
Training of infinitely wide deep nets
Neural Tangent Kernel (NTK)
Neural Tangent Kernel Details
Kernel Linear Regression
Matrix Completion
Matrix Inflation
Deep Linear Net
Gradient Descent
Learning Rates
Formal Statements
Connectivity
Conclusions
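The "Matrix Completion" and "Deep Linear Net" items above refer to recovering a low-rank matrix by running gradient descent on a product of square factors, i.e. a deep linear network. As a rough illustration only, here is a minimal sketch of that setup; the depth, learning rate, initialization scale, and observation pattern are assumptions for the example, not values from the lecture.

```python
import numpy as np

# Hypothetical sketch: complete a low-rank matrix M from partial observations
# by gradient descent on W = W3 @ W2 @ W1 (a depth-3 linear net), minimizing
# the squared error on observed entries only.

rng = np.random.default_rng(1)
n, rank = 20, 2
M = rng.normal(size=(n, rank)) @ rng.normal(size=(rank, n))  # ground-truth low-rank matrix
mask = rng.random((n, n)) < 0.5                               # observed entries

depth, lr, steps = 3, 0.01, 3000
Ws = [np.eye(n) + 0.01 * rng.normal(size=(n, n)) for _ in range(depth)]  # near-identity init

for _ in range(steps):
    # Forward product W = W_depth ... W_1 and residual on observed entries.
    W = Ws[0]
    for Wi in Ws[1:]:
        W = Wi @ W
    R = mask * (W - M)
    # Gradient of 0.5 * ||mask*(W - M)||_F^2 w.r.t. each factor W_i is A^T R B^T,
    # where A is the product of later factors and B the product of earlier ones.
    grads = []
    for i in range(depth):
        A = np.eye(n)
        for Wj in Ws[i + 1:]:
            A = Wj @ A
        B = np.eye(n)
        for Wj in Ws[:i]:
            B = Wj @ B
        grads.append(A.T @ R @ B.T)
    # Simultaneous gradient step on all factors.
    Ws = [Wi - lr * g for Wi, g in zip(Ws, grads)]

W = Ws[0]
for Wi in Ws[1:]:
    W = Wi @ W
print("error on unobserved entries:", np.linalg.norm((~mask) * (W - M)))
```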
Taught by
Institute for Advanced Study