
Analyzing Optimization and Generalization in Deep Learning via Dynamics of Gradient Descent

Simons Institute via YouTube

Overview

Explore the dynamics of gradient descent in deep learning through a comprehensive lecture that delves into optimization and generalization. Analyze the interplay between classical machine learning and deep learning approaches, focusing on gradient flow and end-to-end dynamics. Investigate implicit preconditioning, deep matrix factorization, and the dynamics of singular values in matrix completion problems. Gain insights into the role of nonlinearity in deep learning optimization and generalization, presented by Nadav Cohen from Tel-Aviv University as part of the "Learning and Testing in High Dimensions" series at the Simons Institute.
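
The lecture centers on deep matrix factorization for matrix completion, where the implicit bias of gradient descent causes singular values to be learned incrementally. The following Python sketch is not code from the lecture; it is a minimal illustration of that setting, and all names and hyperparameters (n, depth, the 30% observation rate, the learning rate) are illustrative assumptions. It trains a depth-3 factorization from a small near-zero initialization on a partially observed rank-5 matrix and prints the singular values of the end-to-end matrix as training proceeds.

import numpy as np

rng = np.random.default_rng(0)
n, depth = 20, 3                                   # n x n matrix, depth-3 factorization
ground_truth = rng.standard_normal((n, 5)) @ rng.standard_normal((5, n))  # rank 5
observed = rng.random((n, n)) < 0.3                # ~30% of entries observed

# Small initialization matters: it is what produces the incremental,
# low-rank-biased dynamics discussed in the lecture.
factors = [0.05 * rng.standard_normal((n, n)) for _ in range(depth)]

def product(fs):
    # Multiply [W1, ..., Wk] into Wk @ ... @ W1 (identity for an empty list).
    W = np.eye(n)
    for F in fs:
        W = F @ W
    return W

lr = 0.01                                          # small step, approximating gradient flow
for step in range(20001):
    W = product(factors)                           # end-to-end matrix
    residual = np.where(observed, W - ground_truth, 0.0)  # grad of 0.5*||observed error||_F^2
    # dLoss/dW_j = A^T residual B^T, where A stacks the factors above W_j and B those below.
    grads = [product(factors[j + 1:]).T @ residual @ product(factors[:j]).T
             for j in range(depth)]
    for j in range(depth):
        factors[j] -= lr * grads[j]
    if step % 4000 == 0:
        print(step, np.round(np.linalg.svd(W, compute_uv=False)[:6], 2))

If the sketch behaves as the lecture's analysis predicts, the printed singular values rise roughly one at a time rather than together: in the continuous-time limit with balanced initialization, the companion analysis gives a growth rate for the r-th singular value proportional to N * sigma_r^(2 - 2/N) for a depth-N factorization, so greater depth accentuates the gap between large and small singular values.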

Syllabus

Introduction
What Are Optimization and Generalization?
Classical Machine Learning
Deep Learning
Content
Gradient Flow
End-to-End Dynamics
Conventional Approach
Implicit Preconditioning
Gradient Descent
Depth
Matrix Completion
Deep Matrix Factorization
Experiments
Dynamics of Singular Values
Matrix Completion Problem
Singular Value Dynamics
Recap
Nonlinearity

Taught by

Simons Institute
