Neural Nets for NLP - Debugging Neural Nets for NLP
- 1 Intro
- 2 In Neural Networks, Tuning is Paramount!
- 3 A Typical Situation
- 4 Identifying Training Time Problems
- 5 Is My Model Too Weak?
- 6 Be Careful of Deep Models
- 7 Trouble w/ Optimization
- 8 Reminder: Optimizers (SGD takes a step in the direction of the negative gradient; see the code sketch after this list)
- 9 Learning Rate: the learning rate is an important hyperparameter to tune
- 10 Initialization
- 11 Debugging Minibatching
- 12 Debugging Decoding
- 13 Debugging Search
- 14 Look At Your Data!
- 15 Quantitative Analysis
- 16 Symptoms of Overfitting
- 17 Reminder: Early Stopping, Learning Rate Decay
- 18 Reminder: Dropout (Srivastava et al. 2014): neural nets have many parameters and are prone to overfitting; dropout randomly zeroes out hidden-layer nodes with probability p at training time (see the code sketch after this list)
- 19 A Stark Example (Koehn and Knowles 2017): better search (i.e., better model score) can result in a worse BLEU score!
- 20 Managing Loss Function / Eval Metric Differences: the most principled way is to use structured prediction techniques, to be discussed in future classes
- 21 A Simple Method: Early Stopping w/ Eval Metric (see the code sketch after this list)
- 22 Reproducing Previous Work
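
The sketches below expand on a few of the items above. First, for item 8: a minimal, illustrative SGD step, assuming a simple dict-of-parameters representation. The names `sgd_step`, `params`, and `grads` are placeholders, not from the lecture; each update moves the parameters a small step against the gradient of the loss, scaled by the learning rate from item 9.

```python
def sgd_step(params, grads, learning_rate=0.1):
    """One SGD update: step in the direction of the negative gradient."""
    return {name: params[name] - learning_rate * grads[name] for name in params}

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
params = {"w": 0.0}
for _ in range(100):
    grads = {"w": 2.0 * (params["w"] - 3.0)}
    params = sgd_step(params, grads)
print(round(params["w"], 3))  # converges toward 3.0
```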
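
For item 18, a minimal sketch of the dropout idea summarized on the slide: at training time each hidden unit is zeroed with probability p. This version uses the common "inverted dropout" scaling by 1 / (1 - p), which is an assumption here rather than something stated in the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(hidden, p=0.5, train=True):
    """Randomly zero out hidden-layer units with probability p at training time.

    Inverted dropout: surviving activations are scaled by 1 / (1 - p) so that
    expected activations match between training and test time.
    """
    if not train or p == 0.0:
        return hidden
    mask = rng.random(hidden.shape) >= p  # keep each unit with probability 1 - p
    return hidden * mask / (1.0 - p)

# Toy usage on a batch of hidden vectors.
h = np.ones((3, 4))
print(dropout(h, p=0.5))               # about half the entries zeroed, survivors scaled to 2.0
print(dropout(h, p=0.5, train=False))  # unchanged at test time
```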
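
For item 21, a sketch of early stopping driven by the evaluation metric rather than the loss, which addresses the mismatch highlighted in items 19-20: keep the checkpoint with the best dev-set score (e.g., BLEU) and stop once it has not improved for several epochs. The callables `train_one_epoch` and `evaluate_metric`, and the `patience` value, are hypothetical placeholders, not the lecture's implementation.

```python
import copy

def train_with_early_stopping(model, train_one_epoch, evaluate_metric,
                              max_epochs=50, patience=5):
    """Stop training when the dev-set eval metric (e.g., BLEU) stops improving.

    `train_one_epoch(model)` updates the model in place; `evaluate_metric(model)`
    returns a higher-is-better dev-set score. The best checkpoint is returned,
    not the last one.
    """
    best_score, best_state, stale_epochs = float("-inf"), None, 0
    for _ in range(max_epochs):
        train_one_epoch(model)
        score = evaluate_metric(model)
        if score > best_score:
            best_score, best_state, stale_epochs = score, copy.deepcopy(model), 0
        else:
            stale_epochs += 1
            if stale_epochs >= patience:
                break  # dev metric has plateaued; stop training
    return best_state, best_score
```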