CMU Neural Nets for NLP 2018 - Conditioned Generation

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Language Models: language models are generative models of text
  3. Conditioned Language Models
  4. Ancestral Sampling
  5. Ensembling
  6. Linear or Log Linear?
  7. Parameter Averaging
  8. Ensemble Distillation (e.g. Kim et al. 2016)
  9. Stacking
  10. Basic Evaluation Paradigm
  11. Human Evaluation
  12. Perplexity
  13. A Contrastive Note: Evaluating Unconditioned Generation
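Two of the topics above, ancestral sampling and perplexity, can be illustrated with a toy model. The following is a minimal sketch, not material from the lecture itself: it assumes a made-up bigram distribution over a tiny vocabulary, samples a sentence left to right by drawing each word from P(w_t | w_{t-1}), and computes perplexity as the exponentiated average negative log-likelihood per token.

```python
import math
import random

# Hypothetical toy bigram model (all probabilities invented for illustration).
probs = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.5},
    "a":   {"cat": 0.7, "dog": 0.3},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
}

def ancestral_sample(rng):
    """Sample a sentence left to right, one word at a time from P(w_t | w_{t-1})."""
    sent, prev = [], "<s>"
    while prev != "</s>":
        words, weights = zip(*probs[prev].items())
        prev = rng.choices(words, weights=weights)[0]
        if prev != "</s>":
            sent.append(prev)
    return sent

def perplexity(sentence):
    """exp of the average negative log-likelihood per token (including </s>)."""
    tokens = sentence + ["</s>"]
    prev, nll = "<s>", 0.0
    for w in tokens:
        nll -= math.log(probs[prev][w])
        prev = w
    return math.exp(nll / len(tokens))

print(ancestral_sample(random.Random(0)))
print(perplexity(["the", "cat"]))  # P = 0.6 * 0.5 * 1.0 over 3 tokens
```

Lower perplexity means the model assigns higher probability to the text; here "the cat" gets perplexity 0.3^(-1/3) ≈ 1.49 under the toy model.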
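The "Ensembling" and "Linear or Log Linear?" items contrast two ways of combining the next-word distributions of several models. As a rough sketch (with two invented distributions, not the lecture's examples): linear interpolation averages the probabilities directly, while log-linear interpolation averages log-probabilities and then renormalizes, which behaves more like a geometric mean and is vetoed by any model that assigns a word near-zero probability.

```python
import math

# Two hypothetical models' next-word distributions over the same vocabulary.
p1 = {"cat": 0.9, "dog": 0.1}
p2 = {"cat": 0.2, "dog": 0.8}

def linear(dists):
    """Linear interpolation: average the probabilities (already normalized)."""
    vocab = dists[0].keys()
    return {w: sum(d[w] for d in dists) / len(dists) for w in vocab}

def log_linear(dists):
    """Log-linear interpolation: average log-probs, then renormalize."""
    vocab = list(dists[0].keys())
    scores = {w: sum(math.log(d[w]) for d in dists) / len(dists) for w in vocab}
    z = sum(math.exp(s) for s in scores.values())
    return {w: math.exp(s) / z for w, s in scores.items()}

print(linear([p1, p2]))      # cat: 0.55, dog: 0.45
print(log_linear([p1, p2]))  # cat: 0.6,  dog: 0.4
```

Note how log-linear combination pulls harder toward words both models agree on, while linear combination is dominated by whichever model is most confident.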
