YouTube

Non-negative Gauss-Newton Methods for Empirical Risk Minimization

Paul G. Allen School via YouTube

Overview

Explore a distinguished seminar on optimization and data featuring Lin Xiao of Facebook AI Research. Delve into non-negative Gauss-Newton methods for empirical risk minimization, which target the minimization of an average of many smooth but potentially non-convex functions. Learn how reformulating a non-negative loss as a sum of squared residuals enables Gauss-Newton and Levenberg-Marquardt methods, yielding algorithms with highly adaptive step sizes. Discover the convergence analysis of these methods in convex, non-convex, and stochastic settings, and how their performance compares to classical gradient methods. Gain insights from Lin Xiao's extensive experience in optimization theory and algorithms for deep learning and reinforcement learning, drawing on his work at Meta's Fundamental AI Research team and previous roles at Microsoft Research and top academic institutions.
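The reformulation idea mentioned above can be sketched in a few lines. This is an illustrative toy, not the seminar's exact algorithm: it assumes a scalar residual r(w) = sqrt(2 f(w)) for a non-negative loss f, to which a regularized (Levenberg-Marquardt-style) Gauss-Newton step is applied; the function names and the test objective are invented for the example.

```python
import numpy as np

def ngn_step(w, f, grad_f, sigma=1.0, eps=1e-12):
    """One regularized Gauss-Newton step on the residual r = sqrt(2 f).

    Minimizing (r + grad_r^T d)^2 / 2 + ||d||^2 / (2 sigma) over d, and
    using r * grad_r = grad_f, gives a gradient step with the adaptive
    step size  sigma / (1 + sigma * ||grad_f||^2 / (2 f)).
    """
    fw = f(w)
    g = grad_f(w)
    step = sigma / (1.0 + sigma * np.dot(g, g) / (2.0 * fw + eps))
    return w - step * g

# Toy non-negative, non-convex objective: f(w) = (||w||^2 - 1)^2 / 2.
f = lambda w: 0.5 * (np.dot(w, w) - 1.0) ** 2
grad_f = lambda w: 2.0 * (np.dot(w, w) - 1.0) * w

w = np.array([2.0, -1.5])
for _ in range(50):
    w = ngn_step(w, f, grad_f, sigma=0.5)
# f(w) is driven close to 0, i.e. w approaches the unit circle.
```

Note how the step size interpolates automatically: far from a minimum (large gradient relative to the loss) the step shrinks toward a normalized gradient step, while near a minimum it approaches the constant step sigma, which is one way such methods become "highly adaptive" without manual tuning.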

Syllabus

Distinguished Seminar in Optimization and Data: Lin Xiao (Facebook AI Research)

Taught by

Paul G. Allen School

