PCA, AE, K-Means, Gaussian Mixture Model, Sparse Coding, and Intuitive VAE

Alfredo Canziani via YouTube

Now playing: 18 of 20 – 5. VAE: an intuitive interpretation

Class Central Classrooms

YouTube videos curated by Class Central.

Classroom Contents


  1. 1 – Welcome to class
  2. 2 – Training methods revisited
  3. 3 – Architectural methods
  4. 4 – 1. PCA
  5. 5 – Q&A on Definitions: Labels, unconditional, and un-, self-supervised learning
  6. 6 – 2. Auto-encoder with Bottleneck
  7. 7 – 3. K-Means
  8. 8 – 4. Gaussian mixture model
  9. 9 – Regularized EBM
  10. 10 – Yann out of context
  11. 11 – Q&A on Norms and Posterior: when the student is thinking too far ahead
  12. 12 – 1. Unconditional regularized latent variable EBM: Sparse coding
  13. 13 – Sparse modeling on MNIST & natural patches
  14. 14 – 2. Amortized inference
  15. 15 – ISTA algorithm & RNN Encoder
  16. 16 – 3. Convolutional sparse coding
  17. 17 – 4. Video prediction: very briefly
  18. 18 – 5. VAE: an intuitive interpretation
  19. 19 – Helpful whiteboard stuff
  20. 20 – Another interpretation
