Explore an in-depth analysis of global convergence rates and worst-case evaluation complexity for smooth nonconvex optimization methods in this 46-minute lecture by Coralia Cartis. Discover how steepest descent and Newton's method attain essentially the same sharp worst-case performance bounds, and learn why second-order regularization techniques offer an advantage. Examine the benefits of incorporating higher-order derivative information into regularization frameworks, which yields improved complexity, universal properties, and certification of higher-order criticality. Investigate inexact settings in which derivatives and function evaluations are only occasionally accurate, and see how their worst-case complexity can still be quantified. Gain insight into robust optimization methods whose complexity bounds vary across settings, are sharp, and are in some cases optimal.
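As rough background on the kind of bounds the lecture compares, the evaluation-complexity literature in this area (e.g., work by Cartis, Gould, and Toint) gives the following standard worst-case evaluation counts for driving the gradient of a smooth nonconvex objective below a tolerance epsilon; this is a summary sketch of well-known results, not a statement of the lecture's specific theorems:

\[ \text{steepest descent (and, in the worst case, Newton's method):}\quad \mathcal{O}(\epsilon^{-2}) \ \text{evaluations to ensure } \|\nabla f(x_k)\| \le \epsilon, \]
\[ \text{cubic (second-order) regularization:}\quad \mathcal{O}(\epsilon^{-3/2}), \]
\[ p\text{th-order regularization:}\quad \mathcal{O}\bigl(\epsilon^{-(p+1)/p}\bigr). \]

The first bound is sharp for steepest descent and, perhaps surprisingly, Newton's method does no better in the worst case, which is what motivates the regularized higher-order methods discussed in the lecture.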