Neural Nets for NLP 2021 - Conditioned Generation

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Language Models • Language models are generative models of text
  3. Conditioned Language Models
  4. Calculating the Probability of a Sentence (see the formula after this list)
  5. Conditional Language Models
  6. One Type of Language Model (Mikolov et al. 2011)
  7. How to Pass Hidden State?
  8. The Generation Problem
  9. Ancestral Sampling (see the sketch after this list)
  10. Greedy Search
  11. Beam Search (see the sketch after this list)
  12. Ensembling • Combine predictions from multiple models
  13. Linear Interpolation • Take a weighted average of the M model probabilities (see the sketch after this list)
  14. Log-linear Interpolation • Weighted combination of log probabilities, normalize
  15. Linear or Log-linear?
  16. Parameter Averaging (see the sketch after this list)
  17. Ensemble Distillation (e.g. Kim et al. 2016)
  18. Stacking
  19. Still a Difficult Problem!
  20. From Speaker/Document Traits (Hoang et al. 2016)
  21. From Lists of Traits (Kiddon et al. 2016)
  22. From Word Embeddings (Noraset et al. 2017)
  23. Basic Evaluation Paradigm
  24. Human Evaluation Shared Tasks
  25. Embedding-based Metrics
  26. Perplexity (see the sketch after this list)
  27. Which One to Use?
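
The central object behind items 3-5 is the conditional language model. Assuming the lecture's usual notation, with F the conditioning input (e.g. a source sentence) and E = e_1, ..., e_|E| the output sentence, the probability of a sentence factors token by token:

    P(E \mid F) = \prod_{t=1}^{|E|} P(e_t \mid F, e_1, \ldots, e_{t-1})

An unconditioned language model (item 2) is the special case where F is empty.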
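
Items 9 and 10 are the two simplest ways to generate from such a model. A minimal sketch, assuming a hypothetical next_token_probs(prefix) callback that returns the model's distribution over the next token as a token-to-probability dict (the callback and EOS symbol are illustrative assumptions, not the lecture's code):

    import random

    EOS = "</s>"  # assumed end-of-sentence symbol

    def ancestral_sample(next_token_probs, max_len=50):
        """Item 9: draw each next token at random from the model's distribution."""
        out = []
        while len(out) < max_len:
            probs = next_token_probs(out)          # token -> P(token | prefix)
            tokens, weights = zip(*probs.items())
            tok = random.choices(tokens, weights=weights, k=1)[0]
            if tok == EOS:
                break
            out.append(tok)
        return out

    def greedy_search(next_token_probs, max_len=50):
        """Item 10: always commit to the single most probable next token."""
        out = []
        while len(out) < max_len:
            probs = next_token_probs(out)
            tok = max(probs, key=probs.get)
            if tok == EOS:
                break
            out.append(tok)
        return out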
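
Item 11's beam search keeps the b best partial hypotheses at each step instead of committing to one, scoring each hypothesis by its summed log probability. A sketch under the same assumed interface:

    import math

    EOS = "</s>"  # assumed end-of-sentence symbol

    def beam_search(next_token_probs, beam_size=4, max_len=50):
        """Item 11: expand the beam_size best hypotheses; score by log P."""
        beams = [([], 0.0)]                        # (tokens, summed log-prob)
        finished = []
        for _ in range(max_len):
            candidates = []
            for tokens, score in beams:
                for tok, p in next_token_probs(tokens).items():
                    hyp = (tokens + [tok], score + math.log(p))
                    (finished if tok == EOS else candidates).append(hyp)
            if not candidates:
                break
            # prune to the beam_size highest-scoring continuations
            beams = sorted(candidates, key=lambda h: h[1], reverse=True)[:beam_size]
        finished.extend(beams)                     # fall back to unfinished beams
        return max(finished, key=lambda h: h[1])[0]

Greedy search is the beam_size=1 special case; larger beams trade extra computation for (usually) higher-probability outputs, though, per item 19, decoding remains a difficult problem.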
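
Items 12-14 cover ensembling. Linear interpolation takes a weighted average of the M models' probabilities; log-linear interpolation combines their log probabilities and renormalizes. A minimal sketch over plain token-to-probability dicts, assuming all dicts share one vocabulary:

    import math

    def linear_interpolation(dists, weights):
        """Item 13: P(x) = sum_m w_m * P_m(x); weights should sum to 1."""
        return {x: sum(w * d[x] for d, w in zip(dists, weights))
                for x in dists[0]}

    def log_linear_interpolation(dists, weights):
        """Item 14: P(x) proportional to exp(sum_m w_m * log P_m(x))."""
        scores = {x: math.exp(sum(w * math.log(d[x]) for d, w in zip(dists, weights)))
                  for x in dists[0]}
        z = sum(scores.values())                   # normalize over the vocabulary
        return {x: s / z for x, s in scores.items()}

    # Toy usage: two models over a three-word vocabulary
    p1 = {"a": 0.70, "b": 0.20, "c": 0.10}
    p2 = {"a": 0.01, "b": 0.89, "c": 0.10}
    print(linear_interpolation([p1, p2], [0.5, 0.5]))      # "a" keeps mass despite model 2 (OR-like)
    print(log_linear_interpolation([p1, p2], [0.5, 0.5]))  # "a" is suppressed (AND-like)

The toy run shows the contrast behind item 15: the linear mixture lets a token favored by any one model keep substantial mass, while the log-linear mixture favors tokens that every model considers plausible.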
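
Item 16's parameter averaging is a cheap approximation to ensembling: average the weights of several checkpoints of one training run and decode once with the merged model, instead of running M models at inference time. A sketch, assuming each checkpoint is a name-to-numpy-array dict:

    import numpy as np

    def average_checkpoints(state_dicts):
        """Item 16: element-wise mean of each parameter across checkpoints."""
        n = len(state_dicts)
        return {name: sum(sd[name] for sd in state_dicts) / n
                for name in state_dicts[0]}

    # Toy usage: three "checkpoints" of a one-tensor model
    ckpts = [{"w": np.array([1.0, 2.0])},
             {"w": np.array([3.0, 4.0])},
             {"w": np.array([5.0, 6.0])}]
    print(average_checkpoints(ckpts))   # {'w': array([3., 4.])}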
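
Item 26's perplexity is the standard likelihood-based metric: the exponentiated average negative log probability the model assigns to the reference tokens. A minimal computation:

    import math

    def perplexity(token_log_probs):
        """Item 26: exp of the mean negative log probability per token."""
        return math.exp(-sum(token_log_probs) / len(token_log_probs))

    # Toy usage: a model that gives every reference token probability 0.25
    print(perplexity([math.log(0.25)] * 4))   # 4.0 (lower is better)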
