Explore the convergence properties of Extragradient methods for monotone variational inequalities in this 55-minute DS4DM Coffee Talk presented by Gauthier Gidel from Université de Montréal. Delve into recent advancements in solving saddle point and variational inequality problems, with a focus on their applications in machine learning, particularly Generative Adversarial Networks. Learn about the first last-iterate O(1/K) convergence rates for monotone and Lipschitz variational inequality problems (VIPs) without additional assumptions on the operator. Gain insights into the analysis based on Performance Estimation Problems and computer-aided proofs, and understand the non-trivial issues that arise in turning numerical computations into final proofs. Examine the historical context of the Extragradient and Past Extragradient methods, referencing the works of Korpelevich (1976) and Popov (1980), while exploring their renewed relevance in modern optimization challenges.
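For viewers unfamiliar with the method, the minimal sketch below illustrates the Extragradient update on a simple bilinear saddle point problem with a monotone, Lipschitz operator. It is an illustrative assumption, not code from the talk; the operator F, matrix A, step size gamma, and iteration count are hypothetical choices made for the example.

```python
import numpy as np

# Minimal sketch: Extragradient (Korpelevich, 1976) on the bilinear game
# min_x max_y x^T A y, whose operator F(x, y) = (A y, -A^T x) is monotone
# and Lipschitz. All names and constants here are illustrative assumptions.

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

def F(z):
    """Monotone operator of the bilinear game: F(x, y) = (A y, -A^T x)."""
    x, y = z[:2], z[2:]
    return np.concatenate([A @ y, -A.T @ x])

gamma = 0.1        # step size, assumed smaller than 1/L (L = Lipschitz constant of F)
z = np.ones(4)     # initial point (x_0, y_0)

for k in range(1000):
    z_half = z - gamma * F(z)      # extrapolation step: look ahead along F(z_k)
    z = z - gamma * F(z_half)      # update step: move along F evaluated at the look-ahead point

print("||F(z_K)|| =", np.linalg.norm(F(z)))  # residual of the last iterate, should approach 0
```

The talk's O(1/K) result concerns exactly this kind of last-iterate residual; Popov's Past Extragradient variant (1980) reuses the previously computed operator value in the extrapolation step, saving one evaluation per iteration.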