Neural Nets for NLP 2019 - Advanced Search Algorithms

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Why search?
  3. Basic Pruning Methods (Steinbiss et al. 1994)
  4. Prediction-based Pruning Methods (e.g. Stern et al. 2017)
  5. Backtracking-based Pruning Methods
  6. What beam size should we use?
  7. Variable-length output sequences: in many tasks (e.g. MT), the output sequences will be of variable length
  8. More complicated normalization (Google's Neural Machine Translation System: Bridging the Gap)
  9. Predict the output length (Eriguchi et al. 2016)
  10. Why do Bigger Beams Hurt, pt. 2
  11. Dealing with disparity in actions: Effective Inference for Generative Neural Parsing (Mitchell Stern et al., 2017)
  12. Solution
  13. Improving Diversity in top N Choices
  14. Improving Diversity through Sampling
  15. Sampling without Replacement (cont'd)
  16. Monte-Carlo Tree Search: Human-like Natural Language Generation Using Monte Carlo Tree Search
  17. More beam search in training: A Continuous Relaxation of Beam Search for End-to-end Training of Neural Sequence Models (Goyal et al., 2017)
  18. Adoption with neural networks: CCG Parsing
  19. Is the heuristic admissible? (Lee et al. 2016)
  20. Estimating future costs (Li et al., 2017)
  21. Actor Critic (Bahdanau et al., 2017)
  22. Actor Critic (continued)
  23. A* search: benefits and drawbacks
  24. Particle Filters (Buys et al., 2015)
  25. Reranking (Dyer et al. 2016)
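The early chapters of the outline cover beam search, basic pruning, and length normalization for variable-length outputs. As a rough illustration of those ideas (not the lecture's own code), here is a minimal length-normalized beam search sketch; `expand`, `is_final`, and the `alpha` normalization exponent are hypothetical stand-ins for a model's scoring interface:

```python
import math

def beam_search(start, expand, is_final, beam_size=5, max_len=10, alpha=0.6):
    """Length-normalized beam search.

    expand(state)   -> list of (next_state, log_prob) extensions.
    is_final(state) -> True when a hypothesis is complete.
    Complete hypotheses are ranked by score / length**alpha, a
    GNMT-style length normalization that keeps the search from
    unfairly favoring short outputs.
    """
    beam = [(0.0, 0, start)]          # (sum of log-probs, length, state)
    finished = []
    for _ in range(max_len):
        candidates = []
        for logp, length, state in beam:
            if is_final(state):
                finished.append((logp, length, state))
            else:
                for nxt, step_logp in expand(state):
                    candidates.append((logp + step_logp, length + 1, nxt))
        if not candidates:            # every hypothesis on the beam is complete
            break
        # Basic pruning: keep only the top `beam_size` partial hypotheses
        candidates.sort(key=lambda c: c[0], reverse=True)
        beam = candidates[:beam_size]
    else:
        # Ran out of steps: sweep any complete hypotheses left on the beam
        finished.extend(h for h in beam if is_final(h[2]))
    if not finished:
        return None
    return max(finished, key=lambda h: h[0] / max(h[1], 1) ** alpha)
```

With a toy `expand` that emits a few tokens with fixed probabilities and terminates on a period, the search returns the complete hypothesis with the best length-normalized score rather than simply the shortest one.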
