
Random Initialization and Implicit Regularization in Nonconvex Statistical Estimation - Lecture 2

Georgia Tech Research via YouTube

Overview

Explore the second lecture in a five-part series featuring Princeton University's Yuxin Chen, focusing on random initialization and implicit regularization in nonconvex statistical estimation. Delve into the phenomenon where gradient descent converges to optimal solutions in nonconvex problems like phase retrieval and matrix completion, achieving near-optimal statistical and computational guarantees without careful initialization or explicit regularization. Examine the leave-one-out approach used to decouple statistical dependency between gradient descent iterates and data. Learn about the application of this method to noisy matrix completion, demonstrating near-optimal entrywise error control. Investigate topics such as low-rank matrix recovery, quadratic systems of equations, two-stage approaches, population-level state evolution, and automatic saddle avoidance in this 48-minute talk from the TRIAD Distinguished Lecture Series at Georgia Tech Research.
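The core phenomenon the lecture covers, vanilla gradient descent run from a random starting point on the nonconvex least-squares formulation of phase retrieval, can be sketched in a few lines. This is a minimal illustration only: the problem sizes, step size, and iteration count below are assumptions for the demo, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical problem setup: recover x_star from quadratic measurements
# y_i = (a_i^T x_star)^2 (noiseless phase retrieval).
n, m = 20, 400                       # signal dimension, number of measurements
x_star = rng.standard_normal(n)
x_star /= np.linalg.norm(x_star)     # unit-norm ground truth
A = rng.standard_normal((m, n))      # Gaussian sensing vectors
y = (A @ x_star) ** 2

def grad(x):
    """Gradient of f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2."""
    r = A @ x
    return (A.T @ ((r**2 - y) * r)) / m

# Vanilla gradient descent from random initialization: no spectral
# initialization stage, no explicit regularization terms.
x = rng.standard_normal(n) / np.sqrt(n)
eta = 0.1                            # constant step size (assumed)
for _ in range(1000):
    x -= eta * grad(x)

# The sign of x_star is unidentifiable from y, so measure distance up to sign.
dist = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
```

In line with the lecture's theme, the iterates typically escape the region of small signal strength on their own and then converge linearly, so `dist` ends up small despite the nonconvexity; the careful decoupling of iterates and data (the leave-one-out analysis in the syllabus) is what makes this provable, not anything visible in the code.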

Syllabus

Intro
Statistical models come to rescue
Example: low-rank matrix recovery
Solving quadratic systems of equations
A natural least squares formulation
Rationale of two-stage approach
What does prior theory say?
Exponential growth of signal strength in Stage 1
Our theory: noiseless case
Population-level state evolution
Back to finite-sample analysis
Gradient descent theory revisited
A second look at gradient descent theory
Key proof idea: leave-one-out analysis
Key proof ingredient: random-sign sequences
Automatic saddle avoidance

Taught by

Georgia Tech Research
