
YouTube

Neural Nets for NLP - Latent Random Variables

Graham Neubig via YouTube

Overview

Explore latent random variables in neural networks for natural language processing through this comprehensive lecture from CMU's CS 11-747 course. Delve into the distinctions between generative and discriminative models, as well as deterministic and random variables. Examine variational autoencoders, their architecture, and challenges in training. Learn techniques for handling discrete latent variables, including enumeration, sampling, and reparameterization. Discover practical applications of variational models in language processing, controllable text generation, and symbol sequence modeling. Gain insights from examples and case studies presented throughout the lecture to deepen your understanding of these advanced NLP concepts.

Syllabus

Intro
Discriminative vs. Generative Models • Discriminative model: calculate the probability of output given input
Quiz: What Types of Variables? • In an attentional sequence-to-sequence model trained with MLE/teacher forcing, are the following variables observed or latent? Deterministic or random?
Why Latent Random Variables?
What is a Latent Random Variable Model?
A Latent Variable Model
An Example (Goersch 2016)
Variational Inference
Practice
Variational Autoencoders
VAE vs. AE
Problem! Sampling Breaks Backprop
Solution: Re-parameterization Trick
Motivation for Latent Variables • Allows for a consistent latent space of sentences?
Difficulties in Training
KL Divergence Annealing • Basic idea: Multiply KL term by a constant starting at zero, then gradually increase to 1 • Result: model can learn to use z before getting penalized
Solution 2: Weaken the Decoder • But theoretically still problematic: it can be shown that the optimal strategy is to ignore z when it is not necessary (Chen et al. 2017)
Aggressive Inference Network Learning
Discrete Latent Variables?
Enumeration
Method 2: Sampling • Randomly sample a subset of configurations of z and optimize with respect to this subset
Method 3: Reparameterization (Maddison et al. 2017, Jang et al. 2017)
Variational Models of Language Processing (Miao et al. 2016) • Present models with random variables for document modeling and question-answer pair selection
Controllable Text Generation (Hu et al. 2017)
Symbol Sequence Latent Variables (Miao and Blunsom 2016) • Encoder-decoder with a sequence of latent symbols
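The re-parameterization trick mentioned in the syllabus can be sketched in a few lines: instead of sampling z directly from N(mu, sigma^2), which blocks backpropagation, we sample noise eps ~ N(0, I) and compute z deterministically from mu and log_var. This is a minimal NumPy illustration (function and argument names are mine, not from the lecture):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps with eps ~ N(0, I).

    All randomness is isolated in eps, so gradients can flow
    through the deterministic path (mu, log_var) during training.
    """
    eps = rng.standard_normal(mu.shape)
    sigma = np.exp(0.5 * log_var)  # log-variance parameterization keeps sigma > 0
    return mu + sigma * eps

# Usage: with a near-zero variance, the sample collapses to the mean.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
z = reparameterize(mu, np.full(3, -100.0), rng)
```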
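KL divergence annealing, as described in the syllabus, multiplies the KL term of the VAE objective by a weight that ramps from 0 to 1 so the model learns to use z before being penalized. A small sketch under the usual assumptions (diagonal Gaussian posterior, standard normal prior, linear warmup schedule; helper names are hypothetical):

```python
import numpy as np

def gaussian_kl(mu, log_var):
    """Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) )."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def kl_weight(step, warmup_steps):
    """Linear annealing: 0 at step 0, reaching 1 after warmup_steps."""
    return min(1.0, step / warmup_steps)

def annealed_kl_term(step, warmup_steps, mu, log_var):
    """The (weighted) KL penalty added to the reconstruction loss."""
    return kl_weight(step, warmup_steps) * gaussian_kl(mu, log_var)
```

Other schedules (sigmoid, cyclical) are used in practice; the key property is that the penalty starts near zero.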
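The reparameterization method for discrete latent variables (Maddison et al. 2017, Jang et al. 2017) replaces a hard categorical sample with a differentiable Gumbel-softmax relaxation. A minimal sketch, assuming unbatched logits and a temperature parameter tau (names are mine):

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Continuous relaxation of sampling from Categorical(softmax(logits)).

    Adds Gumbel(0, 1) noise to the logits, then applies a softmax with
    temperature tau; as tau -> 0 the output approaches a one-hot sample.
    """
    u = rng.uniform(low=1e-10, high=1.0, size=logits.shape)
    g = -np.log(-np.log(u))        # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = y - y.max()                # subtract max for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Usage: a soft "sample" over three categories.
p = gumbel_softmax(np.array([1.0, 2.0, 3.0]), tau=0.5,
                   rng=np.random.default_rng(1))
```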

Taught by

Graham Neubig

