Overview
Explore optimization and generalization in deep learning through an analysis of gradient descent trajectories in this 46-minute lecture by Nadav Cohen of the Institute for Advanced Study. Part of the Simons Institute's Frontiers of Deep Learning series, the talk examines the interplay between optimization and the ability of deep learning models to generalize, covering both the mathematical foundations and the practical implications of trajectory-based analysis for the performance and reliability of deep learning systems.
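The lecture's own arguments are not reproduced here; purely as an illustrative sketch of what "tracking a gradient descent trajectory" means (not the speaker's method), the snippet below records the sequence of iterates and losses produced by plain gradient descent on a small least-squares problem. All names and values (X, y, w, lr, steps) are assumptions for this example.

```python
import numpy as np

# Illustrative only: record the trajectory of gradient descent on a
# simple least-squares objective L(w) = 0.5 * ||X w - y||^2.
# The data, step size, and iteration count are assumed for this sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

w = np.zeros(3)          # initialization w_0
lr, steps = 0.01, 200    # step size and number of iterations
trajectory = [w.copy()]  # the sequence of iterates w_0, w_1, ...

for _ in range(steps):
    grad = X.T @ (X @ w - y)   # gradient of the least-squares loss
    w = w - lr * grad          # gradient descent update
    trajectory.append(w.copy())

trajectory = np.stack(trajectory)  # shape: (steps + 1, 3)
losses = 0.5 * np.sum((X @ trajectory.T - y[:, None]) ** 2, axis=0)
print(trajectory[-1], losses[-1])  # final iterate and its loss
```

Trajectory-based analyses of the kind discussed in the lecture study how such sequences of iterates evolve, rather than only the properties of the loss landscape at convergence.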
Syllabus
Analyzing Optimization and Generalization in Deep Learning via Trajectories of Gradient Descent
Taught by
Simons Institute