CMU Neural Nets for NLP - Models with Latent Random Variables

Classroom Contents
- 1 Intro
- 2 Discriminative vs. Generative Models • Discriminative model: calculate the probability of output given input
- 3 Quiz: What Types of Variables?
- 4 Why Latent Random Variables?
- 5 An Example (Doersch 2016)
- 6 Problem: Straightforward Sampling is Inefficient
- 7 Solution: "Inference Model" • Predict which latent point produced the data point using an inference model
- 8 Disconnect Between Samples and Objective
- 9 VAE Objective • We can create an optimizable objective matching our problem, starting with KL divergence (an ELBO sketch follows this list)
- 10 Interpreting the VAE Objective
- 11 Problem! Sampling Breaks Backprop
- 12 Solution: Re-parameterization Trick (a sketch follows this list)
- 13 Generating from Language Models
- 14 Motivation for Latent Variables • Allows for a consistent latent space of sentences?
- 15 Difficulties in Training
- 16 KL Divergence Annealing (a sketch follows this list)
- 17 Weaken the Decoder
- 18 Discrete Latent Variables?
- 19 Method 1: Enumeration
- 20 Reparameterization (Maddison et al. 2017, Jang et al. 2017)
- 21 Gumbel-Softmax • A way to soften the decision and allow for continuous gradients (a sketch follows this list)
- 22 Variational Models of Language Processing (Miao et al. 2016)
- 23 Controllable Text Generation (Hu et al. 2017)
- 24 Symbol Sequence Latent Variables (Miao and Blunsom 2016)
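
The "VAE Objective" segment builds an optimizable objective starting from KL divergence; the result is the evidence lower bound (ELBO). Below is a minimal PyTorch sketch of the resulting loss, assuming a diagonal-Gaussian encoder that outputs `mu` and `log_var`, a standard normal prior, and a Bernoulli decoder; the names here are illustrative, not taken from the lecture.

```python
import torch
import torch.nn.functional as F

def vae_loss(x, x_recon, mu, log_var):
    # Reconstruction term: negative log-likelihood of x under the
    # decoder (Bernoulli decoder, hence binary cross-entropy)
    recon = F.binary_cross_entropy(x_recon, x, reduction="sum")
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian
    kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    # Minimizing recon + kl is equivalent to maximizing the ELBO
    return recon + kl
```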
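
The re-parameterization trick resolves the "sampling breaks backprop" problem by rewriting the sample z as a deterministic function of the encoder outputs plus independent noise, so gradients can flow back through the encoder. A sketch under the same Gaussian-encoder assumption:

```python
import torch

def reparameterize(mu, log_var):
    # z = mu + sigma * eps with eps ~ N(0, I): the randomness is
    # isolated in eps, so z is differentiable w.r.t. mu and log_var
    std = torch.exp(0.5 * log_var)
    eps = torch.randn_like(std)
    return mu + std * eps
```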
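
KL divergence annealing, one of the training fixes covered for latent-variable language models, scales the KL term by a weight that ramps from 0 to 1, letting the decoder learn to use z before the KL cost pushes q(z|x) onto the prior. One common linear schedule (the warm-up length is an arbitrary illustrative value):

```python
def kl_weight(step, warmup_steps=10000):
    # Linearly increase the KL coefficient from 0 to 1 over
    # warmup_steps updates, then hold it at 1
    return min(1.0, step / warmup_steps)

# Annealed objective: loss = recon + kl_weight(step) * kl
```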
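
Gumbel-Softmax softens the discrete sampling decision into a temperature-controlled softmax so continuous gradients are available. A sketch of drawing one relaxed sample, where lowering the temperature `tau` pushes the output toward one-hot:

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits, tau=1.0):
    # Perturb logits with Gumbel(0, 1) noise (sampled via the
    # inverse CDF), then apply a temperature-tau softmax in
    # place of a hard argmax
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return F.softmax((logits + gumbel) / tau, dim=-1)
```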