
YouTube

Neural Nets for NLP 2017 - Conditioned Generation

Graham Neubig via YouTube

Overview

Explore a comprehensive lecture on conditioned generation in neural networks for natural language processing. Delve into encoder-decoder models, conditional generation techniques, and search algorithms. Learn about ensembling methods, evaluation strategies, and various types of data used for conditioning. Access accompanying slides and code examples for hands-on learning. Gain insights into language models, the generation problem, and evaluation paradigms, including human evaluation and perplexity. Part of CMU's Neural Networks for NLP course, this lecture provides essential knowledge for understanding and implementing advanced NLP techniques.

Syllabus

Intro
Language Models: generative models of text
Conditioned Language Models
Conditional Language Models
One Type of Conditional Language Model (Sutskever et al. 2014)
The Generation Problem
Ancestral Sampling
Greedy Search (both decoding strategies are sketched in code after this syllabus)
Ensembling: combine predictions from multiple models
Log-linear Interpolation: weighted combination of log probabilities, then renormalize
Linear or Log Linear? (both interpolation schemes are sketched below)
Parameter Averaging (sketched below)
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity (sketched below)
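
The decoding topics above come down to one decision. A conditioned language model defines P(Y|X) as the product of P(y_t | X, y_1, ..., y_{t-1}) over time steps, and generation is the question of how to pick each y_t. Below is a minimal Python sketch contrasting ancestral sampling with greedy search; the next_token_probs function is a made-up toy distribution standing in for a real model, not anything from the lecture.

```python
import random

VOCAB = ["<eos>", "a", "b", "c"]

def next_token_probs(prefix):
    """Toy stand-in for P(y_t | X, y_1..y_{t-1}): shifts probability
    mass toward <eos> as the prefix grows, so generation terminates."""
    p_eos = min(0.9, 0.1 * (len(prefix) + 1))
    rest = (1.0 - p_eos) / 3
    return {"<eos>": p_eos, "a": rest * 1.5, "b": rest, "c": rest * 0.5}

def ancestral_sample(max_len=10):
    """Ancestral sampling: draw y_t ~ P(y_t | prefix) one step at a time."""
    prefix = []
    for _ in range(max_len):
        probs = next_token_probs(prefix)
        tok = random.choices(list(probs), weights=list(probs.values()))[0]
        if tok == "<eos>":
            break
        prefix.append(tok)
    return prefix

def greedy_search(max_len=10):
    """Greedy search: pick the single most probable token at every step."""
    prefix = []
    for _ in range(max_len):
        probs = next_token_probs(prefix)
        tok = max(probs, key=probs.get)
        if tok == "<eos>":
            break
        prefix.append(tok)
    return prefix

print("sampled:", ancestral_sample())  # varies run to run
print("greedy: ", greedy_search())     # deterministic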
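For the ensembling material, here is a minimal sketch of linear vs. log-linear interpolation of two models' next-token distributions. The two toy distributions and the equal weights are invented for illustration; the point is that linear interpolation keeps mass wherever either model likes a token, while log-linear interpolation favors tokens both models agree on and must be renormalized.

```python
import math

p1 = {"a": 0.7, "b": 0.2, "c": 0.1}   # model 1: P1(y | context)
p2 = {"a": 0.1, "b": 0.2, "c": 0.7}   # model 2: P2(y | context)
w1, w2 = 0.5, 0.5                      # interpolation weights

# Linear interpolation: P(y) = w1*P1(y) + w2*P2(y); already normalized
# when the weights sum to one.
linear = {y: w1 * p1[y] + w2 * p2[y] for y in p1}

# Log-linear interpolation: score(y) = w1*log P1(y) + w2*log P2(y);
# the weighted log probabilities must be renormalized (softmax).
scores = {y: w1 * math.log(p1[y]) + w2 * math.log(p2[y]) for y in p1}
z = sum(math.exp(s) for s in scores.values())
log_linear = {y: math.exp(s) / z for y in scores}

print("linear:    ", linear)      # a and c keep high mass ("OR" of models)
print("log-linear:", log_linear)  # b gains mass: both models support it ("AND")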
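Parameter averaging, also on the syllabus, gets an ensemble-like effect from the checkpoints of a single training run at no extra inference cost by averaging their weights element-wise. A dependency-free sketch, assuming checkpoints stored as plain name-to-list-of-floats dicts (a hypothetical format chosen only to keep the example self-contained):

```python
def average_checkpoints(checkpoints):
    """Element-wise average of the parameters of several checkpoints."""
    n = len(checkpoints)
    return {
        name: [sum(ckpt[name][i] for ckpt in checkpoints) / n
               for i in range(len(checkpoints[0][name]))]
        for name in checkpoints[0]
    }

# e.g. the last three checkpoints of one training run
ckpts = [
    {"W": [1.0, 2.0], "b": [0.5]},
    {"W": [1.2, 1.8], "b": [0.3]},
    {"W": [0.8, 2.2], "b": [0.4]},
]
print(average_checkpoints(ckpts))  # W averages to [1.0, 2.0], b to ~0.4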
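Finally, perplexity, the automatic evaluation measure in the syllabus, is the exponentiated average negative log-likelihood per token: lower is better, and a uniform guess over k tokens gives perplexity k. A minimal sketch with made-up per-token probabilities:

```python
import math

def perplexity(token_probs):
    """token_probs[i] = P(y_i | X, y_1..y_{i-1}) assigned by the model."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0: like guessing among 4 tokens
print(perplexity([0.9, 0.8, 0.95, 0.85]))    # ~1.15: a confident model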

Taught by

Graham Neubig
