Neural Nets for NLP - Debugging Neural Nets for NLP

Graham Neubig via YouTube


Classroom Contents

  1. Intro
  2. In Neural Networks, Tuning is Paramount!
  3. A Typical Situation
  4. Identifying Training Time Problems
  5. Is My Model Too Weak?
  6. Be Careful of Deep Models
  7. Trouble w/ Optimization
  8. Reminder: Optimizers - SGD takes a step along the negative gradient (see the SGD sketch after this list)
  9. Learning Rate - the learning rate is an important parameter
  10. Initialization
  11. Debugging Minibatching (a consistency check is sketched after this list)
  12. Debugging Decoding
  13. Debugging Search
  14. Look At Your Data!
  15. Quantitative Analysis
  16. Symptoms of Overfitting
  17. Reminder: Early Stopping, Learning Rate Decay
  18. Reminder: Dropout (Srivastava et al. 2014) - neural nets have lots of parameters and are prone to overfitting; dropout randomly zeroes out nodes in the hidden layer with probability p at training time… (see the dropout sketch after this list)
  19. A Stark Example (Koehn and Knowles 2017) - better search (= a better model score) can result in a worse BLEU score!
  20. Managing Loss Function / Eval Metric Differences - the most principled way is to use structured prediction techniques, to be discussed in future classes
  21. A Simple Method: Early Stopping w/ Eval Metric (sketched after this list)
  22. Reproducing Previous Work
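
Item 8's reminder corresponds to the standard SGD update, w ← w − η∇L(w). A minimal NumPy sketch of that step (the toy loss and parameter names are illustrative assumptions, not from the lecture):

```python
import numpy as np

def sgd_step(params, grads, lr=0.1):
    """One SGD update: move each parameter a small step along its
    negative gradient."""
    return [p - lr * g for p, g in zip(params, grads)]

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = np.array([0.0])
for _ in range(50):
    grad = 2 * (w - 3.0)
    (w,) = sgd_step([w], [grad], lr=0.1)
print(w)  # converges toward 3.0
```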
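
For item 11, one common sanity check is that a model produces the same outputs whether examples are fed one at a time or as a single batch; mismatches often point to padding or masking bugs. A minimal sketch, where the toy linear model stands in for your network (an assumption for illustration):

```python
import numpy as np

def check_minibatch_consistency(f, xs, atol=1e-5):
    """Run each example alone and as part of one batch; the results
    should agree up to numerical tolerance."""
    single = np.stack([f(x[None, :])[0] for x in xs])
    batched = f(np.stack(xs))
    assert np.allclose(single, batched, atol=atol), "batching changes results!"

# Toy "model": a fixed linear layer.
W = np.random.default_rng(0).normal(size=(4, 3))
f = lambda batch: batch @ W
check_minibatch_consistency(f, [np.ones(4), np.zeros(4), np.arange(4.0)])
```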
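
Item 18's one-line description of dropout can be made concrete. A minimal sketch of the inverted-dropout variant (survivors are rescaled by 1/(1 − p) at training time so the test-time pass is unchanged; the exact formulation in the lecture may differ):

```python
import numpy as np

def dropout(h, p=0.5, train=True, rng=np.random.default_rng()):
    """Zero each hidden unit with probability p at training time and
    rescale the survivors by 1/(1 - p); identity at test time."""
    if not train or p == 0.0:
        return h
    mask = rng.random(h.shape) >= p  # keep each unit with probability 1 - p
    return h * mask / (1.0 - p)

h = np.ones((2, 4))
print(dropout(h, p=0.5))               # about half the entries zeroed, rest = 2.0
print(dropout(h, p=0.5, train=False))  # unchanged at test time
```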
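
Items 20-21 note that when the training loss and the evaluation metric disagree, a simple remedy is to stop (and checkpoint) on the eval metric itself rather than on the loss. A minimal sketch, where train_one_epoch, evaluate_metric, and save_checkpoint are hypothetical stand-ins for your own training code:

```python
def train_with_metric_early_stopping(model, train_data, dev_data,
                                     max_epochs=50, patience=5):
    """Stop when the dev eval metric (e.g. BLEU, higher is better) has
    not improved for `patience` consecutive epochs."""
    best_metric, epochs_since_best = float("-inf"), 0
    for epoch in range(max_epochs):
        train_one_epoch(model, train_data)         # hypothetical helper
        metric = evaluate_metric(model, dev_data)  # hypothetical helper
        if metric > best_metric:
            best_metric, epochs_since_best = metric, 0
            save_checkpoint(model)                 # hypothetical helper
        else:
            epochs_since_best += 1
            if epochs_since_best >= patience:
                break  # dev metric has plateaued; keep the best checkpoint
    return best_metric
```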
