Joint Embedding Method and Latent Variable Energy Based Models

Alfredo Canziani via YouTube

Classroom Contents

  1. 1 – Welcome to class
  2. 2 – Predictive models
  3. 3 – Multi-output system
  4. 4 – Notation factor graph
  5. 5 – The energy function F(x, y)
  6. 6 – Inference
  7. 7 – Implicit function
  8. 8 – Conditional EBM
  9. 9 – Unconditional EBM
  10. 10 – EBM vs. probabilistic models
  11. 11 – Do we need a y at inference?
  12. 12 – When inference is hard
  13. 13 – Joint embeddings
  14. 14 – Latent variables
  15. 15 – Inference with latent variables
  16. 16 – Energies E and F
  17. 17 – Preview on the EBM practicum
  18. 18 – From energy to probabilities
  19. 19 – Examples: K-means and sparse coding
  20. 20 – Limiting the information capacity of the latent variable
  21. 21 – Training EBMs
  22. 22 – Maximum likelihood
  23. 23 – How to pick β?
  24. 24 – Problems with maximum likelihood
  25. 25 – Other types of loss functions
  26. 26 – Generalised margin loss
  27. 27 – General group loss
  28. 28 – Contrastive joint embeddings
  29. 29 – Denoising or mask autoencoder
  30. 30 – Summary and final remarks
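
The chapters above cover the formal pieces of a latent-variable energy-based model: an energy E(x, y, z), the derived energy F(x, y), and the inverse temperature β that links energies to probabilities. As a rough illustration of how these pieces fit together (a sketch only; the energy function, variable names, and candidate-latent setup below are illustrative assumptions, not code from the lecture or practicum), the following computes F(x, y) both as a hard minimum over z and as a β-weighted soft minimum:

```python
import torch

# Minimal latent-variable EBM sketch (assumed, not from the lecture):
# the latent z is eliminated from E(x, y, z) to obtain F(x, y), either by
# minimisation (zero-temperature limit) or by a beta-weighted soft minimum.

def energy(x, y, z):
    # Hypothetical quadratic energy: y should be explained by x plus a latent shift z.
    return ((y - (x + z)) ** 2).sum(dim=-1)

def free_energy_min(x, y, z_candidates):
    # F(x, y) = min_z E(x, y, z), here taken over a finite set of candidate latents.
    energies = torch.stack([energy(x, y, z) for z in z_candidates])
    return energies.min()

def free_energy_beta(x, y, z_candidates, beta=1.0):
    # F_beta(x, y) = -(1/beta) * log sum_z exp(-beta * E(x, y, z)),
    # the soft minimum that connects energies to (unnormalised) log-probabilities.
    energies = torch.stack([energy(x, y, z) for z in z_candidates])
    return -torch.logsumexp(-beta * energies, dim=0) / beta

x = torch.tensor([1.0, 2.0])
y = torch.tensor([1.5, 2.5])
z_candidates = [torch.full((2,), float(dz)) for dz in torch.linspace(-1.0, 1.0, 21)]
print(free_energy_min(x, y, z_candidates))   # hard minimum over z
print(free_energy_beta(x, y, z_candidates))  # soft minimum; approaches the hard min as beta grows
```

As β grows, the soft minimum approaches the hard minimum, which is one way to read the "How to pick β?" chapter: β controls how sharply the latent variable is committed to a single value.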
