
Contrastive Methods and Regularised Latent Variable Models

Alfredo Canziani via YouTube

Overview

Dive into a comprehensive lecture on contrastive methods and regularized latent variable models in deep learning. Explore the advantages of contrastive methods in self-supervised learning, the architecture of denoising autoencoders, and other contrastive techniques such as contrastive divergence. Examine regularized latent variable Energy-Based Models (EBMs) in detail, covering both conditional and unconditional versions. Learn about algorithms such as ISTA, FISTA, and LISTA, and study examples of sparse coding and of the filters learned by convolutional sparse auto-encoders. Conclude with an in-depth discussion of Variational Auto-Encoders and their underlying concepts. This lecture, delivered by Yann LeCun, is part of a broader deep learning course and offers nearly two hours of advanced content for those interested in cutting-edge machine learning techniques.
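
As a rough illustration of the kind of sparse-coding problem the ISTA algorithm addresses, here is a minimal NumPy sketch; the dictionary D, the signal x, and the step-size and sparsity settings are illustrative assumptions and are not taken from the lecture itself.

# Minimal ISTA sketch for sparse coding (assumed example, not from the lecture):
# minimize 0.5 * ||x - D z||^2 + lam * ||z||_1 over the sparse code z.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(x, D, lam=0.1, n_iters=300):
    # Step size from the Lipschitz constant of the reconstruction term's gradient.
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    z = np.zeros(D.shape[1])
    for _ in range(n_iters):
        grad = D.T @ (D @ z - x)                          # gradient of 0.5 * ||x - D z||^2
        z = soft_threshold(z - step * grad, step * lam)   # gradient step followed by shrinkage
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = rng.standard_normal((64, 256))        # overcomplete dictionary (assumed shape)
    z_true = np.zeros(256)
    idx = rng.choice(256, size=5, replace=False)
    z_true[idx] = rng.standard_normal(5)      # a 5-sparse ground-truth code
    x = D @ z_true
    z_hat = ista(x, D, lam=0.05)
    print("coefficients with |z| > 1e-3:", np.count_nonzero(np.abs(z_hat) > 1e-3))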

Syllabus

– Week 8 – Lecture
– Recap on EBM and Characteristics of Different Contrastive Methods
– Contrastive Methods in Self-Supervised Learning
– Denoising Autoencoder and other Contrastive methods
– Overview of Regularized Latent Variable Energy Based Models and Sparse Coding
– Convolutional Sparse Auto-Encoders
– Variational Auto-Encoders

Taught by

Alfredo Canziani

