Overview
Dive into a comprehensive lecture on contrastive methods and regularized latent variable models in deep learning. Explore the advantages of contrastive methods in self-supervised learning, the architecture of denoising autoencoders, and other contrastive techniques such as contrastive divergence. Examine regularized latent variable Energy-Based Models (EBMs) in detail, covering both conditional and unconditional versions. Learn about algorithms such as ISTA, FISTA, and LISTA, and study examples of sparse coding and the filters learned by convolutional sparse auto-encoders. Conclude with an in-depth discussion of Variational Auto-Encoders and their underlying concepts. This lecture, delivered by Yann LeCun, is part of a broader deep learning course and offers nearly two hours of advanced content for those interested in cutting-edge machine learning techniques.
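To give a flavor of the sparse-coding algorithms named above, the sketch below shows one possible ISTA (Iterative Shrinkage-Thresholding Algorithm) loop in Python/NumPy. It is an illustrative sketch, not code from the lecture; the dictionary W, sparsity weight lam, and iteration count are assumptions made for this example.

# Minimal ISTA sketch for sparse coding: find a sparse code z minimizing
#   0.5 * ||x - W @ z||^2 + lam * ||z||_1
# Illustrative example only, not taken from the lecture.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(x, W, lam=0.1, n_iter=100):
    # Step size from the Lipschitz constant of the gradient
    # (largest eigenvalue of W^T W) so the iteration converges.
    L = np.linalg.norm(W, 2) ** 2
    alpha = 1.0 / L
    z = np.zeros(W.shape[1])
    for _ in range(n_iter):
        # Gradient step on the reconstruction term, then L1 shrinkage.
        grad = W.T @ (W @ z - x)
        z = soft_threshold(z - alpha * grad, lam * alpha)
    return z

# Toy usage with a random dictionary and input.
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 256))
x = rng.standard_normal(64)
z = ista(x, W)
print("nonzero code entries:", np.count_nonzero(z))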
Syllabus
– Week 8 – Lecture
– Recap on EBM and Characteristics of Different Contrastive Methods
– Contrastive Methods in Self-Supervised Learning
– Denoising Autoencoder and Other Contrastive Methods
– Overview of Regularized Latent Variable Energy-Based Models and Sparse Coding
– Convolutional Sparse Auto-Encoders
– Variational Auto-Encoders
Taught by
Alfredo Canziani