Classroom Contents
Neural Nets for NLP - Latent Variable Models
- 1 Intro
- 2 Discriminative vs. Generative Models
- 3 Quiz: What Types of Variables?
- 4 Why Latent Random Variables?
- 5 A Latent Variable Model
- 6 What is Our Loss Function? We would like to maximize the corpus log likelihood.
- 7 Disconnect Between Samples and Objective
- 8 VAE Objective: We can create an optimizable objective matching our problem, starting with KL divergence (sketched after this list).
- 9 Interpreting the VAE Objective
- 10 Problem: Straightforward Sampling is Inefficient
- 11 Problem! Sampling Breaks Backprop
- 12 Solution: Re-parameterization Trick (see the code sketch after this list)
- 13 Generating from Language Models
- 14 Motivation for Latent Variables
- 15 Difficulties in Training
- 16 KL Divergence Annealing
- 17 Weaken the Decoder
- 18 Discrete Latent Variables?
- 19 Method 1: Enumeration
- 20 Method 2: Sampling
- 21 Reparameterization (Maddison et al. 2017; Jang et al. 2017)
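
As a reference for items 6 and 8: the objective they describe is to maximize the corpus log likelihood, which the VAE bounds from below by a reconstruction term minus a KL divergence to the prior. This is the textbook evidence lower bound (ELBO), written as a sketch rather than a transcription of the lecture's exact notation; the symbols (encoder q_phi, decoder p_theta, prior p(z)) are assumptions.

```latex
% Corpus log likelihood we would like to maximize (item 6),
% and the evidence lower bound (ELBO) the VAE optimizes instead (item 8):
\max_{\theta} \sum_{x \in \mathcal{X}} \log p_{\theta}(x),
\qquad
\log p_{\theta}(x) \;\ge\;
  \mathbb{E}_{q_{\phi}(z \mid x)}\big[\log p_{\theta}(x \mid z)\big]
  \;-\; \mathrm{KL}\big(q_{\phi}(z \mid x) \,\|\, p(z)\big)
```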
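
Item 12's re-parameterization trick addresses item 11: a raw sampling step blocks backprop, so instead we draw parameter-free noise and shift/scale it deterministically. A minimal PyTorch sketch, assuming a Gaussian latent with a predicted mean and log-variance; the function and argument names are illustrative, not taken from the video:

```python
import torch

def reparameterize(mu: torch.Tensor, log_var: torch.Tensor) -> torch.Tensor:
    """Sample z ~ N(mu, sigma^2) as a deterministic function of (mu, sigma)
    plus parameter-free noise, so gradients flow into mu and log_var."""
    std = torch.exp(0.5 * log_var)   # sigma = exp(log_var / 2)
    eps = torch.randn_like(std)      # eps ~ N(0, I); no gradient flows through it
    return mu + eps * std            # differentiable w.r.t. mu and log_var
```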
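
For discrete latent variables (items 18-21), enumeration and sampling both have drawbacks, and item 21's reparameterization refers to the Concrete / Gumbel-Softmax relaxation of Maddison et al. 2017 and Jang et al. 2017: add Gumbel noise to the logits and take a temperature-controlled softmax, giving a differentiable stand-in for a categorical sample. A hedged sketch of that idea, again with illustrative names:

```python
import torch

def gumbel_softmax_sample(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Differentiable approximation to sampling a category from `logits`."""
    uniform = torch.rand_like(logits)
    gumbel = -torch.log(-torch.log(uniform + 1e-20) + 1e-20)  # Gumbel(0, 1) noise
    return torch.softmax((logits + gumbel) / tau, dim=-1)     # ~one-hot as tau -> 0
```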