Explore the intricacies of optimization algorithms in this 48-minute seminar from GERAD Research Center. Delve into the urban legend surrounding the "complexity lower bound" for strongly convex functions with Lipschitz gradients. Revisit Polyak's original heavy-ball algorithm and examine the conditions necessary for its global convergence. Gain insights from Iman Shames of The Australian National University as he presents his research on the global convergence and asymptotic optimality of the heavy-ball method.
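As background for the seminar, the heavy-ball iteration adds a momentum term to gradient descent: x_{k+1} = x_k - α∇f(x_k) + β(x_k - x_{k-1}). Below is a minimal sketch on a strongly convex quadratic, using Polyak's classical step-size and momentum tuning for quadratics; the example function, matrix `A`, and parameter choices are illustrative assumptions, not from the talk itself, whose setting is general strongly convex functions with Lipschitz gradients.

```python
import numpy as np

def heavy_ball(grad, x0, alpha, beta, iters=500):
    """Polyak's heavy-ball method: x_{k+1} = x_k - alpha*grad(x_k) + beta*(x_k - x_{k-1})."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        x_next = x - alpha * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 0.0], [0.0, 1.0]])   # eigenvalues give mu = 1, L = 3
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b               # gradient of the quadratic

# Polyak's tuning for quadratics:
#   alpha = 4 / (sqrt(L) + sqrt(mu))^2,  beta = ((sqrt(L)-sqrt(mu))/(sqrt(L)+sqrt(mu)))^2
mu, L = 1.0, 3.0
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x_star = np.linalg.solve(A, b)           # exact minimizer A^{-1} b
x_hat = heavy_ball(grad, np.zeros(2), alpha, beta)
print(np.allclose(x_hat, x_star, atol=1e-8))  # → True
```

On quadratics this tuning is optimal, but as the seminar discusses, guaranteeing global convergence for general strongly convex functions requires additional conditions on the parameters.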
Global Convergence and Asymptotic Optimality of the Heavy Ball Method
GERAD Research Center via YouTube
Syllabus
Global Convergence and Asymptotic Optimality of the Heavy Ball Method, Iman Shames