Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2021 - Recurrent Neural Networks
- 1 Intro
- 2 NLP and Sequential Data
- 3 Long-distance Dependencies in Language
- 4 Can be Complicated!
- 5 Recurrent Neural Networks (Elman 1990)
- 6 Training RNNs
- 7 Parameter Tying
- 8 What Can RNNs Do?
- 9 Representing Sentences
- 10 e.g. Language Modeling
- 11 Vanishing Gradient: Gradients decrease as they get pushed back
- 12 A Solution: Long Short-term Memory (Hochreiter and Schmidhuber 1997)
- 13 LSTM Structure
- 14 What can LSTMs Learn? (1)
- 15 Handling Mini-batching
- 16 Mini-batching Method
- 17 Bucketing/Sorting
- 18 Optimized Implementations of LSTMs (Appleyard 2015)
- 19 Gated Recurrent Units (Cho et al. 2014)
- 20 Soft Hierarchical Structure
- 21 Handling Long Sequences
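The core recurrence covered in the lecture (the Elman network, item 5, and the parameter tying of item 7) can be sketched in a few lines. This is a minimal illustrative sketch in plain Python, not code from the video; all names and toy numbers here are my own.

```python
import math

def elman_step(x, h, W_xh, W_hh, b_h):
    """One Elman RNN step: h' = tanh(W_xh @ x + W_hh @ h + b_h)."""
    out = []
    for i in range(len(b_h)):
        s = b_h[i]
        s += sum(W_xh[i][j] * x[j] for j in range(len(x)))
        s += sum(W_hh[i][j] * h[j] for j in range(len(h)))
        out.append(math.tanh(s))
    return out

def elman_rnn(xs, W_xh, W_hh, b_h):
    """Run the RNN over a sequence. The SAME weights are reused at every
    time step (parameter tying), so sequences of any length are handled."""
    h = [0.0] * len(b_h)          # initial hidden state
    states = []
    for x in xs:                  # one step per input token
        h = elman_step(x, h, W_xh, W_hh, b_h)
        states.append(h)
    return states

# Toy example (hypothetical weights): 2-dim inputs, 2-dim hidden state
W_xh = [[0.5, -0.3], [0.1, 0.2]]
W_hh = [[0.4, 0.0], [0.0, 0.4]]
b_h = [0.0, 0.0]
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
states = elman_rnn(seq, W_xh, W_hh, b_h)
print(len(states))  # one hidden state per input token: 3
```

The final hidden state can serve as a fixed-size sentence representation (item 9); the repeated multiplication by `W_hh` is also why gradients shrink over long spans (item 11), motivating the LSTM of items 12 and 13.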