Overview
Explore dimensionality reduction techniques for matrix- and tensor-coded data in this comprehensive lecture by Alex Williams from Stanford University. Delve into the theoretical foundations and practical applications of matrix and tensor factorizations, including PCA, non-negative matrix factorization (NMF), and independent component analysis (ICA). Focus on canonical polyadic (CP) tensor decomposition as an extension of PCA to higher-order data arrays. Learn about recent developments in the field, and work through hands-on exercises and practical advice for implementing these models. Cover topics such as the rotation problem, sparse PCA, Bayes rule, logistic PCA, loss functions, and cross-validation. Access additional resources, including slides, references, and exercises, to further enhance your understanding of these powerful data compression and analysis techniques.
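For orientation only, here is a minimal sketch of the three core decompositions the lecture covers on synthetic data; it is not lecture code, and it assumes NumPy, scikit-learn, and TensorLy are available.

```python
# Minimal sketch (assumptions: NumPy, scikit-learn, and TensorLy installed;
# synthetic data). Illustrates PCA, NMF, and CP decomposition, not the
# lecture's own implementations.
import numpy as np
from sklearn.decomposition import PCA, NMF
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)

# Matrix-coded data: 100 observations x 20 features, approximately rank 3
# and non-negative, so both PCA and NMF apply.
X = rng.random((100, 3)) @ rng.random((3, 20))

# PCA: orthogonal components ordered by explained variance.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)        # (100, 3) component scores
loadings = pca.components_           # (3, 20) component loadings

# NMF: non-negative factors W (100, 3) and H (3, 20) with X ~ W @ H.
nmf = NMF(n_components=3, init="nndsvda", max_iter=500)
W = nmf.fit_transform(X)
H = nmf.components_

# Tensor-coded data: a rank-3 three-way array (e.g. neurons x time x trials).
A, B, C = rng.random((40, 3)), rng.random((50, 3)), rng.random((30, 3))
T = tl.tensor(np.einsum("ir,jr,kr->ijk", A, B, C))

# CP (canonical polyadic) decomposition: fit rank-3 factors and check the fit.
cp = parafac(T, rank=3)
T_hat = tl.cp_to_tensor(cp)
print("CP relative error:", float(tl.norm(T - T_hat) / tl.norm(T)))
```

The CP model replaces the rank-R matrix factorization X ≈ WH with a sum of R outer products across three or more modes, which is the sense in which the lecture presents it as the extension of PCA to higher-order data arrays.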
Syllabus
Intro
Strategy
Other datasets
Imaging datasets
Matrix decomposition
Outline
Formal Definition
The Rotation Problem
Non-Negative Matrix Factorization
Sparse Principal Components Analysis
L1 vs L2 penalties
Sparse PCA
Sparse NMF
Bayes Rule
Logistic PCA
Loss Functions
General Framework
Alternating minimization
In practice
Cross-validation
Taught by
MITCBMM