Explore an in-depth analysis of global convergence rates and worst-case evaluation complexity for smooth nonconvex optimization methods in this 46-minute lecture by Coralia Cartis. Discover how steepest descent and Newton's method share the same sharp worst-case bound of order O(ε⁻²) evaluations for reaching approximate first-order criticality, and how second-order (cubic) regularization techniques improve this to O(ε⁻³ᐟ²). Examine the benefits of incorporating higher-order derivative information in regularization frameworks, which yields further improved complexity bounds, universality properties, and certification of higher-order criticality. Investigate inexact settings, in which derivatives and function values are only occasionally accurate, and see how their worst-case complexity can still be quantified. Gain insight into robust optimization methods whose complexity bounds are sharp and, in some regimes, provably optimal.
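The O(ε⁻²) bound mentioned above counts how many gradient evaluations steepest descent may need, in the worst case, to drive ‖∇f(x)‖ below a tolerance ε. The following minimal sketch (not material from the lecture; the test function, tolerance, and line-search constants are illustrative assumptions) counts gradient evaluations for an Armijo-backtracking steepest-descent method on a smooth nonconvex function:

```python
import numpy as np

def f(x):
    # Smooth nonconvex test function with local minimizers at (+-1, 0)
    return (x[0]**2 - 1)**2 + x[1]**2

def grad_f(x):
    return np.array([4 * x[0] * (x[0]**2 - 1), 2 * x[1]])

def steepest_descent(f, grad, x0, eps=1e-6, max_iter=10_000):
    """Armijo-backtracking steepest descent.

    Returns the final iterate and the number of gradient evaluations
    used to reach first-order criticality ||grad f(x)|| <= eps.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    n_grad = 1
    while np.linalg.norm(g) > eps and n_grad < max_iter:
        t, fx = 1.0, f(x)
        # Halve the step until the Armijo sufficient-decrease test holds
        while f(x - t * g) > fx - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x - t * g
        g = grad(x)
        n_grad += 1
    return x, n_grad

x_star, n_evals = steepest_descent(f, grad_f, [0.5, 0.5])
print(f"gradient evaluations: {n_evals}, "
      f"||grad|| = {np.linalg.norm(grad_f(x_star)):.2e}")
```

On benign problems the method terminates far faster than the worst-case bound; the Cartis–Gould–Toint examples discussed in the lecture show that specially constructed functions can force the full O(ε⁻²) evaluation count.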
Evaluation Complexity of Algorithms for Nonconvex Optimization
International Mathematical Union via YouTube