SGD in the Large - Average-Case Analysis, Asymptotics, and Stepsize Criticality

Fields Institute via YouTube

Overview

Explore the intricacies of Stochastic Gradient Descent (SGD) in a comprehensive lecture delivered by Courtney Paquette from McGill University at the Machine Learning Advances and Applications Seminar. Delve into average-case analysis, asymptotics, and stepsize criticality as key components of SGD. Examine optimization problems, average-case complexity, and the role of randomness and distribution in SGD. Investigate the SGD congruence theorem and its implications for worst-case scenarios. Uncover the nuances of stepsize criticality and its impact on average-case complexity. Learn about stochastic momentum, the stochastic heavy ball method, and the significance of momentum parameters. Discover dimension-dependent momentum and its applications in logistic regression. Gain valuable insights into the average-case analysis of SGD and its relevance in machine learning applications.
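The two iterations at the heart of the talk, SGD and the stochastic heavy ball method, can be sketched on a toy least-squares problem. This is an illustrative example under assumed settings (random Gaussian data, constant stepsize), not code from the lecture:

```python
import numpy as np

# Illustrative sketch (not from the lecture): SGD and the stochastic
# heavy ball method (SGD plus momentum) on a random least-squares problem,
# min_x (1/2n) * ||A x - b||^2.
rng = np.random.default_rng(0)
n, d = 200, 50
A = rng.standard_normal((n, d)) / np.sqrt(n)
b = rng.standard_normal(n)

def stochastic_heavy_ball(steps=2000, gamma=0.1, beta=0.9):
    """x_{k+1} = x_k - gamma * g_k + beta * (x_k - x_{k-1}).

    gamma is the stepsize, beta the momentum parameter;
    beta = 0 recovers plain SGD.
    """
    x = np.zeros(d)
    x_prev = x.copy()
    for _ in range(steps):
        i = rng.integers(n)                 # sample one row uniformly
        g = (A[i] @ x - b[i]) * A[i]        # stochastic gradient estimate
        x, x_prev = x - gamma * g + beta * (x - x_prev), x
    return 0.5 * np.mean((A @ x - b) ** 2)  # average squared residual

loss_sgd = stochastic_heavy_ball(beta=0.0, gamma=0.5)  # plain SGD
loss_shb = stochastic_heavy_ball(beta=0.9, gamma=0.1)  # heavy ball
```

With a constant stepsize, both iterations settle into a noise ball around the least-squares solution; the lecture's average-case analysis characterizes this limiting behavior as a function of the stepsize and momentum parameters, including the critical stepsize beyond which the dynamics change qualitatively.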

Syllabus

Introduction
Optimization Problems
Average-Case Complexity
Randomness
Distribution
SGD Example
SGD Worst Case
SGD Congruence Theorem
Stepsize Criticality
Average-Case Complexity
Stochastic Momentum
Stochastic Heavy Ball
Momentum Parameters
Dimension-Dependent Momentum
Thank You
Logistic Regression
Average-Case Analysis

Taught by

Fields Institute
