Explore the complexities of sparse linear regression in this 54-minute lecture presented by Raghu Meka at IPAM's EnCORE Workshop. Delve into the gap between statistical sample complexity and algorithmic efficiency, particularly with Gaussian covariates. Discover new methods that overcome limitations of traditional approaches such as Lasso and basis pursuit, especially when the covariance matrix is ill-conditioned. Learn about solutions for cases involving low-treewidth dependency graphs, few bad correlations, or correlations arising from a small number of latent variables. Examine lower bounds showing the limitations of broad classes of algorithms for this problem. Gain insights from joint research with Jon Kelner, Frederic Koehler, and Dhruv Rohatgi in this comprehensive exploration of sparse linear regression's fundamental role in signal processing, statistics, and machine learning.
Syllabus
Raghu Meka - Complexity of Sparse Linear Regression - IPAM at UCLA
Taught by
Institute for Pure & Applied Mathematics (IPAM)