Overview
Syllabus
– Welcome to class
– Training methods revisited
– Architectural methods
– 1. PCA
– Q&A on definitions: labels, unconditional, and un-/self-supervised learning
– 2. Auto-encoder with Bottleneck
– 3. K-Means
– 4. Gaussian mixture model
– Regularized EBM
– Yann out of context
– Q&A on Norms and Posterior: when the student is thinking too far ahead
– 1. Unconditional regularized latent variable EBM: Sparse coding
– Sparse modeling on MNIST & natural patches
– 2. Amortized inference
– ISTA algorithm & RNN Encoder
– 3. Convolutional sparse coding
– 4. Video prediction: very briefly
– 5. VAE: an intuitive interpretation
– Helpful whiteboard stuff
– Another interpretation
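
The syllabus above mentions sparse coding and the ISTA algorithm. As a rough orientation for what that segment covers, here is a minimal NumPy sketch of ISTA (iterative soft-thresholding) for inferring a sparse code; the dictionary `D`, the regularization weight `lam`, and the toy data are illustrative assumptions, not material from the course.

```python
import numpy as np

def ista(x, D, lam=0.1, n_iter=100):
    """ISTA sketch: minimize 0.5 * ||x - D z||^2 + lam * ||z||_1 over z."""
    # Step size 1/L, with L the Lipschitz constant of the gradient
    # of the reconstruction term (largest eigenvalue of D^T D).
    L = np.linalg.norm(D, 2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        # Gradient step on the reconstruction term ...
        z_grad = z - (D.T @ (D @ z - x)) / L
        # ... followed by soft-thresholding (proximal step for the L1 term).
        z = np.sign(z_grad) * np.maximum(np.abs(z_grad) - lam / L, 0.0)
    return z

# Toy example (hypothetical): recover a 2-sparse code from a random dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
z_true = np.zeros(50)
z_true[[3, 17]] = [1.0, -2.0]
x = D @ z_true
z_hat = ista(x, D, lam=0.05, n_iter=500)
```

Unrolling a fixed number of these iterations with learned parameters gives the LISTA-style RNN encoder that the "Amortized inference" segment refers to.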
Taught by
Alfredo Canziani