Overview
Learn about random projection techniques in data mining through this 38-minute lecture that explores the relationship between Principal Component Analysis (PCA) and Singular Value Decomposition (SVD), before diving into random projection methods. Understand the mathematical foundations starting with best rank-k approximations, then progress to the Johnson-Lindenstrauss lemma and its implications for dimensionality reduction. Discover how to determine the minimum k value for random projections, examine the algorithm's implementation, and grasp the theoretical underpinnings that make this technique effective. Conclude with a concise mathematical formulation that ties all concepts together.
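The random projection idea the lecture covers can be sketched in a few lines. The names `random_projection` and `min_k` below are illustrative, not from the lecture: the sketch assumes the standard Gaussian construction (project with a k×d matrix of i.i.d. N(0,1) entries scaled by 1/√k) and one common form of the Johnson-Lindenstrauss bound on the minimum target dimension, k ≥ 4 ln n / (ε²/2 − ε³/3).

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions using a k x d matrix
    of i.i.d. N(0, 1) entries, scaled by 1/sqrt(k) so that squared
    pairwise distances are preserved in expectation (assumed standard
    Gaussian construction; the lecture may use a different variant)."""
    rng = random.Random(seed)
    d = len(points[0])
    R = [[rng.gauss(0.0, 1.0) for _ in range(d)] for _ in range(k)]
    scale = 1.0 / math.sqrt(k)
    return [
        [scale * sum(R[i][j] * x[j] for j in range(d)) for i in range(k)]
        for x in points
    ]

def min_k(n, eps):
    """One common Johnson-Lindenstrauss bound on the minimum target
    dimension for n points with distortion (1 +/- eps):
    k >= 4 ln n / (eps^2 / 2 - eps^3 / 3)."""
    return math.ceil(4 * math.log(n) / (eps ** 2 / 2 - eps ** 3 / 3))
```

Note that the bound depends only on the number of points n and the distortion ε, not on the original dimension d — that independence from d is the striking content of the lemma.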
Syllabus
Lecture starts
Best rank-k approximation recap
PCA vs. SVD
Random projection motivation
Johnson-Lindenstrauss lemma
Minimum k for which random projections work
Random projection algorithm
Why does this work?
Compactly written version
Lecture ends
Taught by
UofU Data Science