Exploiting Sparsity and Structure in Parametric and Nonparametric Estimation - 2007
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore new findings on sparse estimation in parametric graphical models and nonparametric regression in high dimensions in this 1-hour 5-minute lecture by John Lafferty of Carnegie Mellon University. Delve into l1 regularization techniques for estimating graph structure in high-dimensional settings, and discover a novel nonparametric lasso method that regularizes the derivatives of the estimator. Examine the challenges of semi-supervised learning and how unlabeled data can potentially improve estimation. Analyze current regularization methods through the lens of minimax theory, and learn about new approaches that yield improved convergence rates. Gain insights from Lafferty's extensive background in machine learning, statistical learning theory, computational statistics, and natural language processing as he presents joint work with collaborators in this Center for Language & Speech Processing talk at Johns Hopkins University.
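As a rough illustration of the kind of l1-penalized graph-structure estimation the overview mentions, the minimal sketch below fits a sparse Gaussian graphical model to synthetic data using scikit-learn's GraphicalLasso. The chain-graph data, the penalty value alpha, and the edge threshold are illustrative assumptions for this sketch, not the estimator or settings discussed in the lecture.

```python
# Minimal sketch (illustrative, not from the talk): l1-regularized estimation
# of a sparse Gaussian graphical model via scikit-learn's GraphicalLasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Assumed ground truth: a chain graph, so the precision (inverse covariance)
# matrix is tridiagonal and each variable depends only on its neighbors.
p = 10
precision = np.eye(p)
for i in range(p - 1):
    precision[i, i + 1] = precision[i + 1, i] = 0.4
covariance = np.linalg.inv(precision)

# Sample data where n is not much larger than p (a "high-dimensional" flavor).
n = 200
X = rng.multivariate_normal(np.zeros(p), covariance, size=n)

# The l1 penalty (alpha, chosen here by hand) shrinks entries of the estimated
# precision matrix to zero; surviving nonzeros define the estimated edges.
model = GraphicalLasso(alpha=0.1).fit(X)
estimated_edges = np.abs(model.precision_) > 1e-3
np.fill_diagonal(estimated_edges, False)

print("estimated edges:", np.argwhere(np.triu(estimated_edges)))
```

With a reasonable penalty, the recovered edge set should roughly match the chain structure; raising alpha prunes more edges, lowering it adds spurious ones.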
Syllabus
Exploiting Sparsity and Structure in Parametric and Nonparametric Estimation - John Lafferty - 2007
Taught by
Center for Language & Speech Processing (CLSP), JHU