Neural Nets for NLP - Models with Latent Random Variables

Graham Neubig via YouTube

Current video: Reparameterization Trick (14 of 33)
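The lecture itself is not transcribed on this page. As a rough illustration of what the reparameterization trick refers to, here is a minimal PyTorch sketch; it is not taken from the lecture, and the tensor shapes and variable names are arbitrary assumptions.

```python
import torch

def reparameterize(mu, log_var):
    # Draw z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, I),
    # so the sample stays differentiable with respect to mu and log_var.
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + eps * std

# Hypothetical example: a batch of 4 latent vectors of dimension 8.
mu = torch.zeros(4, 8, requires_grad=True)
log_var = torch.zeros(4, 8, requires_grad=True)
z = reparameterize(mu, log_var)
z.sum().backward()  # gradients flow back to mu and log_var through the sample
```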


Classroom Contents

  1. Introduction
  2. Discriminative vs generative
  3. Observed vs latent variables
  4. Quiz
  5. Latent Variable Models
  6. Types of latent random variables
  7. Example
  8. Loss Function
  9. Variational inference
  10. Reconstruction loss and KL regularizer
  11. Regularized auto encoder
  12. Regularized autoencoder
  13. Learning the VAE
  14. Reparameterization Trick
  15. General
  16. Language
  17. VAE
  18. Reparameterization
  19. Motivation
  20. Consistency
  21. Semantic Similarity
  22. Solutions
  23. Free Bits
  24. Weaken Decoder
  25. Aggressive Inference Network
  26. Handling Discrete latent variables
  27. Discrete latent variables
  28. Sampling discrete variables
  29. Gumbel softmax (see the sketch after this list)
  30. Application examples
  31. Discrete random variables
  32. Tree structured latent variables
  33. Discussion question
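Video 29 covers the Gumbel-softmax relaxation for backpropagating through samples of discrete latent variables. As a minimal sketch of the idea, assuming PyTorch and arbitrary logits (this example is not taken from the lecture), the built-in torch.nn.functional.gumbel_softmax can be used:

```python
import torch
import torch.nn.functional as F

# Hypothetical logits over 5 categories for a batch of 2 examples.
logits = torch.randn(2, 5, requires_grad=True)

# Soft relaxed sample: differentiable, each row sums to 1.
soft_sample = F.gumbel_softmax(logits, tau=0.5)

# Straight-through variant: one-hot in the forward pass, soft gradients backward.
hard_sample = F.gumbel_softmax(logits, tau=0.5, hard=True)

hard_sample.sum().backward()  # gradients still reach the logits
```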
