

The Effectiveness of Nonconvex Tensor Completion - Fast Convergence and Uncertainty Quantification

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore the effectiveness of nonconvex optimization for noisy tensor completion in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into Yuxin Chen's presentation on a two-stage nonconvex algorithm that addresses the high-volatility issue in sample-starved regimes, enabling linear convergence, minimal sample complexity, and minimax statistical accuracy. Learn about the characterization of the nonconvex estimator's distribution and its application in constructing entrywise confidence intervals for unseen tensor entries and unknown tensor factors. Gain insights into the role of statistical models in facilitating efficient and guaranteed nonconvex statistical learning, covering topics such as imperfect data acquisition, statistical computational gaps, gradient descent challenges, and key proof ideas like leave-one-out decoupling.
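The talk centers on minimizing a nonconvex least-squares objective over low-rank tensor factors, using a careful initialization stage followed by gradient descent. The sketch below is a simplified, illustrative rendition of that two-stage recipe for a symmetric rank-r order-3 tensor, not the speaker's exact algorithm or analysis: a crude spectral-style initialization from a matrix unfolding, then plain gradient descent on the observed-entry least-squares loss. All function names, the sign-resolution heuristic, and the step-size choice are illustrative assumptions.

```python
# Illustrative two-stage sketch for noisy symmetric tensor completion (assumptions throughout):
#   objective  f(U) = 0.5 * sum over observed (i,j,k) of ( [sum_l U[i,l]U[j,l]U[k,l]] - Y[i,j,k] )^2
import numpy as np

def observe(T, p, sigma, rng):
    """Sample each entry independently with probability p and add Gaussian noise."""
    mask = rng.random(T.shape) < p
    noisy = T + sigma * rng.standard_normal(T.shape)
    return mask, noisy * mask

def loss_and_grad(U, mask, Y):
    """Least-squares loss over observed entries and its gradient with respect to U."""
    That = np.einsum('il,jl,kl->ijk', U, U, U)          # current rank-r estimate
    R = mask * (That - Y)                               # residual on observed entries only
    loss = 0.5 * np.sum(R ** 2)
    # gradient: symmetrize the contraction over the three modes of the order-3 residual
    G = (np.einsum('ijk,jl,kl->il', R, U, U)
         + np.einsum('ijk,il,kl->jl', R, U, U)
         + np.einsum('ijk,il,jl->kl', R, U, U))
    return loss, G

def two_stage_complete(mask, Y, r, steps=500):
    """Stage 1: crude spectral-style initialization; Stage 2: gradient descent (sketch)."""
    n = Y.shape[0]
    p_hat = mask.mean()
    # --- Stage 1: unfold the rescaled observations into an n x n^2 matrix and use its
    #     top-r left singular vectors (with a cube-root scaling) as rough factor estimates.
    M = (Y / p_hat).reshape(n, -1)
    Uinit, s, _ = np.linalg.svd(M, full_matrices=False)
    U = Uinit[:, :r] * np.cbrt(s[:r])
    for l in range(r):                                  # resolve SVD sign ambiguity
        A = np.einsum('i,j,k->ijk', U[:, l], U[:, l], U[:, l])
        if np.sum(mask * Y * A) < 0:
            U[:, l] = -U[:, l]
    # --- Stage 2: gradient descent with a heuristic step size and crude backtracking.
    eta = 1.0 / (p_hat * n * np.linalg.norm(U, 2) ** 4)
    for _ in range(steps):
        fval, G = loss_and_grad(U, mask, Y)
        U_new = U - eta * G
        f_new, _ = loss_and_grad(U_new, mask, Y)
        if f_new > fval:
            eta *= 0.5                                   # shrink the step if the loss went up
        else:
            U = U_new
    return U, np.einsum('il,jl,kl->ijk', U, U, U)

# Example (illustrative only; accuracy varies with seed, sampling rate, and tuning):
rng = np.random.default_rng(0)
n, r = 30, 2
Ustar = rng.standard_normal((n, r))
Tstar = np.einsum('il,jl,kl->ijk', Ustar, Ustar, Ustar)
mask, Y = observe(Tstar, p=0.3, sigma=0.01, rng=rng)
Uhat, That = two_stage_complete(mask, Y, r)
print('relative estimation error:', np.linalg.norm(That - Tstar) / np.linalg.norm(Tstar))
```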

Syllabus

Intro
Imperfect data acquisition
Statistical computational gap
Prior art
A nonconvex least squares formulation
Gradient descent (GD) with random initialization?
A negative conjecture
Our proposal: a two-stage nonconvex algorithm
Rationale of two-stage approach
A bit more detail about initialization
Assumptions
Numerical experiments
No need for sample splitting
Key proof ideas: leave-one-out decoupling
Distributional theory
Back to estimation

Taught by

Institute for Pure & Applied Mathematics (IPAM)
