Extragradient Methods - O(1/K) Last-Iterate Convergence for Monotone Variational Inequalities

GERAD Research Center via YouTube

Overview

Explore the convergence properties of extragradient methods for monotone variational inequalities in this 55-minute DS4DM Coffee Talk presented by Gauthier Gidel from Université de Montréal. Delve into recent advances in solving saddle-point and variational inequality problems, with a focus on their applications in machine learning, particularly Generative Adversarial Networks. Learn about the first last-iterate O(1/K) convergence rates for monotone and Lipschitz variational inequality problems (VIPs) without additional assumptions on the operator. Gain insight into an analysis based on Performance Estimation Problems and computer-aided proofs, including the non-trivial issues encountered in turning numerical computations into final proofs. Examine the historical context of the Extragradient and Past Extragradient methods, referencing the works of Korpelevich (1976) and Popov (1980), and their renewed relevance to modern optimization challenges.
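The extragradient update discussed in the talk takes a "lookahead" step with the operator and then updates using the operator evaluated at that lookahead point. The following is a minimal NumPy sketch, not code from the talk; the function name, step size, and the bilinear test problem are illustrative choices:

```python
import numpy as np

def extragradient(F, z0, step_size, num_iters):
    """Extragradient iteration for a monotone operator F:
    z_half = z - step * F(z)       (extrapolation / lookahead)
    z_next = z - step * F(z_half)  (update at the lookahead point)
    """
    z = np.asarray(z0, dtype=float)
    for _ in range(num_iters):
        z_half = z - step_size * F(z)
        z = z - step_size * F(z_half)
    return z

# Illustrative test: the bilinear saddle point min_x max_y x*y,
# whose (monotone, Lipschitz) operator is F(x, y) = (y, -x).
F = lambda z: np.array([z[1], -z[0]])
z_final = extragradient(F, [1.0, 1.0], step_size=0.1, num_iters=200)
```

On this bilinear problem, plain simultaneous gradient descent-ascent spirals outward, while the extragradient iterate contracts toward the solution (0, 0), which is the behavior the last-iterate convergence results quantify.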

Syllabus

[Korpelevich, 1976] Korpelevich, G. M. 1976. The extragradient method for finding saddle points and other problems. Matecon, 12, 747–756.

Taught by

GERAD Research Center
