

Dimensionality Reduction for Matrix- and Tensor-Coded Data

MIT CBMM via YouTube

Overview

Explore dimensionality reduction techniques for matrix- and tensor-coded data in this comprehensive lecture by Alex Williams from Stanford University. Delve into the theoretical foundations and practical applications of matrix and tensor factorizations, including PCA, non-negative matrix factorization (NMF), and independent component analysis (ICA). Focus on canonical polyadic (CP) tensor decomposition as an extension of PCA to higher-order data arrays. Review recent developments in the field, work through hands-on exercises, and pick up practical advice for implementing these models. Cover topics such as the rotation problem, sparse PCA, Bayes' rule, logistic PCA, loss functions, and cross-validation. Access additional resources, including slides, references, and exercises, to further deepen your understanding of these powerful data compression and analysis techniques.
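
For orientation, here is a minimal, self-contained sketch (not taken from the lecture materials) of the matrix-factorization view of PCA that the lecture builds on: a centered data matrix is approximated by the product of a low-dimensional score matrix and a component matrix, computed here with a truncated SVD in NumPy. The data shape and rank below are hypothetical.

```python
# Minimal sketch: PCA as a rank-r matrix factorization X ≈ scores @ components.
# The data dimensions and rank are illustrative assumptions, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))      # hypothetical data matrix (e.g. trials x features)
Xc = X - X.mean(axis=0)                 # center each column before factorizing

r = 5                                   # number of components to keep
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U[:, :r] * S[:r]               # low-dimensional representation (100 x r)
components = Vt[:r]                     # principal axes (r x 50)

X_hat = scores @ components             # rank-r reconstruction of the data
print("relative reconstruction error:",
      np.linalg.norm(Xc - X_hat) / np.linalg.norm(Xc))
```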

Syllabus

Intro
Strategy
Other datasets
Imaging datasets
Matrix decomposition
Outline
Formal Definition
The Rotation Problem
Non-Negative Matrix Factorization
Sparse Principal Components Analysis
L1 vs L2 penalties
Sparse PCA
Sparse NMF
Bayes' Rule
Logistic PCA
Loss Functions
General Framework
Alternating minimization
In practice
Cross-validation
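
As a companion to the syllabus topics on non-negative matrix factorization and alternating minimization, here is a hedged NumPy sketch of NMF fit with the classic multiplicative updates. The data, rank, and iteration count are illustrative assumptions, not the lecture's own code.

```python
# Minimal sketch: non-negative matrix factorization X ≈ W @ H fit by
# alternating multiplicative updates (Lee & Seung style).
# Data, rank, and iteration count are hypothetical choices for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = np.abs(rng.standard_normal((100, 50)))   # hypothetical non-negative data matrix

r = 5                                        # factorization rank
W = np.abs(rng.standard_normal((100, r)))    # non-negative factor initializations
H = np.abs(rng.standard_normal((r, 50)))

eps = 1e-9                                   # guard against division by zero
for _ in range(200):
    # Alternate between updating H and W; each update preserves non-negativity.
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

print("relative error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```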

Taught by

MIT CBMM
