
Sampling for Linear Algebra, Statistics, and Optimization I

Simons Institute via YouTube

Overview

Explore the foundations of randomized numerical linear algebra in this lecture from the Foundations of Data Science Boot Camp. Delve into key concepts such as element-wise sampling, row/column sampling, and random projections as preconditioners. Learn about approximating matrix multiplication, subspace embeddings, and the importance of leverage and condition in algorithms. Examine meta-algorithms for ℓ2-norm and ℓ1-norm regression, and discover structural results for least-squares approximation. Gain insights into RAM implementations and extensions to low-rank approximation using projections. Presented by Michael Mahoney of the International Computer Science Institute and UC Berkeley, this talk provides a deep dive into sampling techniques for linear algebra, statistics, and optimization.
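The approximate matrix multiplication idea mentioned above can be illustrated with a short sketch. This is a minimal NumPy implementation (not from the lecture itself) of the standard RandNLA estimator: sample column/row outer products with probabilities proportional to their norm products, rescale so the estimate is unbiased, and average.

```python
import numpy as np

def approx_matmul(A, B, c, rng=None):
    """Approximate A @ B by sampling c column/row outer products.

    Column i of A and row i of B are drawn with probability
    p_i proportional to ||A[:, i]|| * ||B[i, :]|| (the optimal
    sampling distribution for this estimator), and each sampled
    outer product is rescaled by 1 / (c * p_i) so that the
    estimator is unbiased: E[C] = A @ B.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[1]
    # Norm-based sampling probabilities over the n column/row pairs.
    norms = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = norms / norms.sum()
    idx = rng.choice(n, size=c, p=p)
    # Sum the rescaled sampled outer products.
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in idx:
        C += np.outer(A[:, i], B[i, :]) / (c * p[i])
    return C
```

The expected squared Frobenius error decays like 1/c, so taking more samples trades computation for accuracy; this is the "BasicMatrixMultiplication" primitive that underlies many of the row/column sampling results covered in the talk.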

Syllabus

Intro
Outline
Background and Overview
RandNLA: Randomized Numerical Linear Algebra
Basic RandNLA Principles
Element-wise Sampling
Row/column Sampling
Random Projections as Preconditioners
Approximating Matrix Multiplication
Subspace Embeddings
Two important notions: leverage and condition
Meta-algorithm for ℓ2-norm regression (2 of 3)
Meta-algorithm for ℓ1-norm regression (3 of 3)
Least-squares approximation: the basic structural result
Least-squares approximation: RAM implementations
Extensions to Low-rank Approximation (Projections)
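The subspace-embedding and least-squares items in the syllabus fit together in one short recipe: project a tall least-squares problem down with a random sketch and solve the small problem. The following is a minimal NumPy sketch of that idea (an illustration under a Gaussian sketching matrix, not the lecture's own implementation).

```python
import numpy as np

def sketched_lstsq(A, b, s, rng=None):
    """Approximately solve min_x ||A x - b||_2 for tall A (n >> d).

    A Gaussian sketch S of shape (s, n) acts as a subspace embedding
    for the column span of [A, b] when s is modestly larger than d,
    so solving the small s x d problem min ||S A x - S b||_2 yields a
    solution whose residual is within a (1 + eps) factor of optimal.
    """
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    # Gaussian sketching matrix, scaled so E[S^T S] = I.
    S = rng.standard_normal((s, n)) / np.sqrt(s)
    # Solve the sketched (much smaller) least-squares problem.
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x
```

In practice, structured sketches (e.g. subsampled randomized Hadamard transforms) or leverage-score row sampling replace the dense Gaussian matrix to reduce the sketching cost, which is the point of the "preconditioners" and "leverage" items above.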

Taught by

Simons Institute

