Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Debugging Neural Nets for NLP
- 1 Intro
- 2 In Neural Networks, Tuning is Paramount!
- 3 A Typical Situation
- 4 Possible Causes
- 5 Identifying Training Time Problems
- 6 Is My Model Too Weak? Your model needs to be big enough to learn. Model size depends on the task: for language modeling, at least 512 nodes; for natural language analysis, 128 or so may do. Multiple …
- 7 Be Careful of Deep Models
- 8 Trouble w/ Optimization
- 9 Reminder: Optimizers
- 10 Initialization
- 11 Bucketing/Sorting • If we use sentences of different lengths, too much padding can result in slow training • To remedy this, sort sentences so similarly-lengthed sentences are in the same …
- 12 Debugging Decoding
- 13 Beam Search
- 14 Debugging Search
- 15 Look At Your Data!
- 16 Symptoms of Overfitting
- 17 Reminder: Dev-driven Learning Rate Decay Start w/ a high learning rate, then degrade the learning rate when you start overfitting the development set (the newbob learning rate schedule)
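The bucketing/sorting remedy from item 11 can be sketched as follows. This is a minimal illustration, not code from the lecture: sort sentences by length, slice into batches of similar-length neighbors (minimizing padding), then shuffle the batch order so training still sees varied lengths.

```python
import random

def bucket_batches(sentences, batch_size):
    """Group sentences of similar length into minibatches to reduce padding."""
    # Sort by length so neighboring sentences have similar lengths
    ordered = sorted(sentences, key=len)
    # Slice into contiguous batches; each batch needs little padding
    batches = [ordered[i:i + batch_size]
               for i in range(0, len(ordered), batch_size)]
    # Shuffle batch order so each epoch sees lengths in a different order
    random.shuffle(batches)
    return batches

# Toy corpus: token lists of varying length
corpus = [["w"] * n for n in [3, 9, 2, 7, 8, 1, 4, 6]]
for batch in bucket_batches(corpus, batch_size=2):
    print([len(s) for s in batch])
```

Each printed batch pairs sentences whose lengths differ by at most one token, so almost no padding is wasted.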
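The dev-driven decay from item 17 can be sketched as one simple variant of the newbob schedule (an illustrative assumption, not the lecture's exact recipe): keep the learning rate fixed while dev loss improves, and multiply it by a decay factor whenever an epoch fails to beat the best dev loss so far.

```python
def newbob_schedule(dev_losses, init_lr=1.0, decay=0.5):
    """Return the learning rate to use after each epoch's dev evaluation.

    `dev_losses` is the development-set loss measured after each epoch.
    The rate is halved (by `decay`) whenever dev loss fails to improve,
    i.e. when the model starts overfitting the training data.
    """
    lr = init_lr
    best = float("inf")
    lrs = []
    for loss in dev_losses:
        if loss >= best:      # no dev improvement: decay the rate
            lr *= decay
        else:                 # dev loss improved: record new best
            best = loss
        lrs.append(lr)
    return lrs

print(newbob_schedule([2.0, 1.5, 1.6, 1.4, 1.45]))
# → [1.0, 1.0, 0.5, 0.5, 0.25]
```

The rate drops exactly at epochs 3 and 5, where the dev loss went up; frameworks implement the same idea as "reduce learning rate on plateau" schedulers.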