Conditioned Language Models
Classroom Contents
CMU Neural Nets for NLP 2018 - Conditioned Generation
- 1 Intro
- 2 Language Models: generative models of text
- 3 Conditioned Language Models
- 4 Ancestral Sampling (sampling is sketched in code after this list)
- 5 Ensembling
- 6 Linear or Log Linear? (see the ensembling sketch after this list)
- 7 Parameter Averaging
- 8 Ensemble Distillation (e.g. Kim et al. 2016)
- 9 Stacking
- 10 Basic Evaluation Paradigm
- 11 Human Evaluation
- 12 Perplexity (see the perplexity sketch after this list)
- 13 A Contrastive Note: Evaluating Unconditioned Generation
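
As a minimal illustration of the ensembling and sampling chapters above, the NumPy sketch below combines the next-token distributions of several language models by linear interpolation (averaging probabilities) or log-linear interpolation (averaging log-probabilities and renormalizing), then generates an output with ancestral sampling. The `models` list and the `next_token_probs` interface are hypothetical stand-ins, not code from the lecture.

```python
# Sketch of ensembling two or more language models and sampling from the mix.
# Assumes each model exposes a hypothetical next_token_probs(context, prefix)
# method returning a probability distribution over the vocabulary.
import numpy as np

def linear_interpolation(prob_dists, weights):
    """Linear ensembling: weighted average of the members' probabilities."""
    return sum(w * p for w, p in zip(weights, prob_dists))

def log_linear_interpolation(prob_dists, weights):
    """Log-linear ensembling: weighted average of log-probabilities, renormalized."""
    log_mix = sum(w * np.log(p + 1e-12) for w, p in zip(weights, prob_dists))
    unnorm = np.exp(log_mix)
    return unnorm / unnorm.sum()

def ancestral_sample(models, weights, context, eos_id, max_len=50,
                     combine=linear_interpolation, rng=None):
    """Ancestral sampling: draw each next token from the ensembled conditional
    distribution, feed it back in, and stop at the end-of-sentence token."""
    rng = rng or np.random.default_rng()
    output = []
    for _ in range(max_len):
        dists = [m.next_token_probs(context, output) for m in models]  # hypothetical interface
        p = combine(dists, weights)
        token = int(rng.choice(len(p), p=p))
        if token == eos_id:
            break
        output.append(token)
    return output
```

Intuitively, linear interpolation acts like a logical OR (a token keeps probability mass if any member likes it), while log-linear interpolation acts more like a logical AND (every member must assign it probability), which is roughly the trade-off behind the "Linear or Log Linear?" question.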
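
For the evaluation chapters, the short sketch below computes perplexity, the likelihood-based metric: the exponential of the average negative log-likelihood per token on held-out text. The `token_log_probs` input is a hypothetical list of per-token natural-log probabilities assigned by the model.

```python
# Sketch of corpus-level perplexity from per-token log-probabilities.
import math

def perplexity(token_log_probs):
    """exp of the average negative log-likelihood per token."""
    n = len(token_log_probs)
    return math.exp(-sum(token_log_probs) / n)

# Example: a model assigning probability 0.25 to each of 4 tokens has
# perplexity 4, i.e. it is as uncertain as a uniform choice over 4 options.
print(perplexity([math.log(0.25)] * 4))  # -> 4.0
```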