CMU Neural Nets for NLP - Models with Latent Random Variables

Graham Neubig via YouTube

Solution: "Inference Model" • Predict which latent point produced the data point using inference

7 of 24

7 of 24

Solution: "Inference Model" • Predict which latent point produced the data point using inference

Classroom Contents

  1. Intro
  2. Discriminative vs. Generative Models • Discriminative model: calculate the probability of the output given the input
  3. Quiz: What Types of Variables?
  4. Why Latent Random Variables?
  5. An Example (Doersch 2016)
  6. Problem: Straightforward Sampling is Inefficient
  7. Solution: "Inference Model" • Predict which latent point produced the data point using inference
  8. Disconnect Between Samples and Objective
  9. VAE Objective • We can create an optimizable objective matching our problem, starting with KL divergence
  10. Interpreting the VAE Objective
  11. Problem! Sampling Breaks Backprop
  12. Solution: Re-parameterization Trick (see the sketch after this list)
  13. Generating from Language Models
  14. Motivation for Latent Variables • Allows for a consistent latent space of sentences?
  15. Difficulties in Training
  16. KL Divergence Annealing
  17. Weaken the Decoder
  18. Discrete Latent Variables?
  19. Method 1: Enumeration
  20. Reparameterization (Maddison et al. 2017, Jang et al. 2017)
  21. Gumbel-Softmax • A way to soften the decision and allow for continuous gradients (see the sketch after this list)
  22. Variational Models of Language Processing (Miao et al. 2016)
  23. Controllable Text Generation (Hu et al. 2017)
  24. Symbol Sequence Latent Variables (Miao and Blunsom 2016)
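Chapters 12 and 21 cover the two standard tricks for keeping sampling steps differentiable: the Gaussian re-parameterization trick for continuous latent variables and Gumbel-Softmax for discrete ones. The sketch below is only a rough illustration of the two ideas, not code from the lecture; the use of PyTorch and the function names are assumptions of this summary.

```python
import torch

def gaussian_reparameterize(mu, log_var):
    # Re-parameterization trick (chapter 12): instead of sampling z ~ N(mu, sigma^2)
    # directly, sample eps ~ N(0, I) and compute z = mu + sigma * eps, so gradients
    # can flow back into mu and log_var.
    eps = torch.randn_like(mu)
    return mu + torch.exp(0.5 * log_var) * eps

def gumbel_softmax_sample(logits, temperature=1.0):
    # Gumbel-Softmax (chapter 21): add Gumbel noise to the logits and apply a
    # temperature-controlled softmax, giving a continuous relaxation of a
    # one-hot sample from the categorical distribution.
    gumbel_noise = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return torch.softmax((logits + gumbel_noise) / temperature, dim=-1)
```

In a VAE, an encoder would predict mu and log_var, the sampled z would feed the decoder, and the KL term of the objective would be computed from mu and log_var; lowering the temperature makes Gumbel-Softmax samples approach discrete one-hot vectors, at the cost of noisier gradients.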
