Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Joint Embedding Method and Latent Variable Energy Based Models
- 1 – Welcome to class
- 2 – Predictive models
- 3 – Multi-output system
- 4 – Notation (factor graph)
- 5 – The energy function F(x, y)
- 6 – Inference
- 7 – Implicit function
- 8 – Conditional EBM
- 9 – Unconditional EBM
- 10 – EBM vs. probabilistic models
- 11 – Do we need a y at inference?
- 12 – When inference is hard
- 13 – Joint embeddings
- 14 – Latent variables
- 15 – Inference with latent variables
- 16 – Energies E and F
- 17 – Preview on the EBM practicum
- 18 – From energy to probabilities
- 19 – Examples: K-means and sparse coding
- 20 – Limiting the information capacity of the latent variable
- 21 – Training EBMs
- 22 – Maximum likelihood
- 23 – How to pick β?
- 24 – Problems with maximum likelihood
- 25 – Other types of loss functions
- 26 – Generalised margin loss
- 27 – General group loss
- 28 – Contrastive joint embeddings
- 29 – Denoising or mask autoencoder
- 30 – Summary and final remarks
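
For quick reference while watching the chapters on energies E and F, inference with latent variables, and going from energy to probabilities (items 5, 15, 16, 18, 23), the sketch below gives a standard formulation of these quantities in energy-based models. It is a summary of the usual definitions, not a transcript of the lecture; the inverse temperature β is the hyperparameter discussed in "How to pick β?".

```latex
% Free energy F obtained from the energy E by marginalizing the latent variable z.
% As \beta \to \infty this tends to a hard minimum over z.
F_\beta(x, y) = -\frac{1}{\beta} \log \int_z \exp\bigl(-\beta\, E(x, y, z)\bigr)\, dz

% Inference: pick the output with the lowest free energy.
y^\star = \operatorname*{arg\,min}_y F(x, y)

% From energies to probabilities via the Gibbs distribution (requires the
% normalizing integral over y' to exist).
P(y \mid x) = \frac{\exp\bigl(-\beta\, F(x, y)\bigr)}{\int_{y'} \exp\bigl(-\beta\, F(x, y')\bigr)\, dy'}
```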