Overview
Explore the frontiers of non-convex optimization in machine learning through this illuminating lecture by Princeton University's Tengyu Ma. Delve into recent breakthroughs in deep learning and the challenges of analyzing complex, high-dimensional models trained on massive datasets. Discover new algorithmic approaches and analysis tools for non-convex methods, including insights into why matrix completion can be solved by stochastic gradient descent. Examine the landscape of objective functions for linearized recurrent neural networks and residual networks, and learn how over-parameterization and re-parameterization can simplify the optimization landscape. Gain valuable knowledge about the formal study of non-convex methods and their applications in advancing machine learning techniques.
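To make the matrix-completion claim concrete, here is a minimal illustrative sketch (not taken from the lecture): plain stochastic gradient descent on a low-rank factorization X ≈ UVᵀ, minimizing squared error over observed entries. The problem sizes, rank, learning rate, and synthetic data below are all assumptions chosen for illustration.

```python
# Illustrative sketch (assumed setup, not from the lecture): matrix completion
# via SGD on a low-rank factorization X ~ U @ V.T, fitting only observed entries.
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 100, 80, 5                                     # matrix dimensions and rank (assumed)
M = rng.normal(size=(n, r)) @ rng.normal(size=(r, m))    # synthetic ground-truth low-rank matrix
mask = rng.random((n, m)) < 0.3                          # ~30% of entries observed
observed = np.argwhere(mask)

# Factorized variables; the objective sum over observed (i, j) of
# (M[i, j] - U[i] @ V[j])**2 is non-convex in (U, V), yet SGD fits it well here.
U = 0.1 * rng.normal(size=(n, r))
V = 0.1 * rng.normal(size=(m, r))
lr = 0.01                                                # learning rate (assumed)

for epoch in range(30):
    rng.shuffle(observed)
    for i, j in observed:
        err = M[i, j] - U[i] @ V[j]                      # residual on one observed entry
        U[i] += lr * err * V[j]                          # gradient step for row i of U
        V[j] += lr * err * U[i]                          # gradient step for row j of V

rmse = np.sqrt(np.mean((M[mask] - (U @ V.T)[mask]) ** 2))
print(f"RMSE on observed entries after training: {rmse:.4f}")
```

The factorized objective is non-convex, so this toy run only illustrates the empirical phenomenon the lecture examines formally; it is not the analysis presented in the talk.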
Syllabus
Allen School Colloquia: Tengyu Ma (Princeton University)
Taught by
Paul G. Allen School