Class Central Classrooms
YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP - Recurrent Networks for Sentence or Language Modeling
- 1 Intro
- 2 Why Model Sentence Pairs?
- 3 Siamese Network (Bromley et al. 1993)
- 4 Convolutional Matching Model (Hu et al. 2014) • Concatenate sentences into a 3D tensor and perform convolution
- 5 Convolutional Features + Matrix-based Pooling (Yin and Schütze 2015)
- 6 NLP and Sequential Data
- 7 Long-distance Dependencies in Language
- 8 Can be Complicated!
- 9 Recurrent Neural Networks (Elman 1990)
- 10 Unrolling in Time • What does processing a sequence look like?
- 11 What Can RNNs Do?
- 12 Representing Sentences
- 13 e.g. Language Modeling
- 14 RNNLM Example: Loss Calculation and State Update (see the RNNLM sketch after this list)
- 15 Vanishing Gradient • Gradients decrease as they get pushed back
- 16 LSTM Structure (see the LSTM cell sketch after this list)
- 17 What can LSTMs Learn? (2) (Shi et al. 2016, Radford et al. 2017) • Count length of sentence
- 18 Handling Long Sequences
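The chapters above move from the Elman recurrence (item 9) through unrolling in time (item 10) to language modeling and the RNNLM loss/state update (items 13 and 14). As a companion, here is a minimal NumPy sketch of one such model; the sizes and weight names (`W_xh`, `W_hh`, `W_hy`) are illustrative assumptions, not taken from the lecture slides.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, hidden_dim = 100, 16, 32

E    = rng.normal(0, 0.1, (vocab_size, embed_dim))   # word embeddings
W_xh = rng.normal(0, 0.1, (embed_dim, hidden_dim))   # input -> hidden
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))  # hidden -> hidden (Elman recurrence)
b_h  = np.zeros(hidden_dim)
W_hy = rng.normal(0, 0.1, (hidden_dim, vocab_size))  # hidden -> vocabulary logits
b_y  = np.zeros(vocab_size)

def rnnlm_loss(token_ids):
    """Unroll the RNN over a sentence; return total cross-entropy loss."""
    h, loss = np.zeros(hidden_dim), 0.0
    for t in range(len(token_ids) - 1):
        x = E[token_ids[t]]                       # embed the current word
        h = np.tanh(x @ W_xh + h @ W_hh + b_h)    # state update from input + previous state
        logits = h @ W_hy + b_y
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                      # softmax over the vocabulary
        loss += -np.log(probs[token_ids[t + 1]])  # cross-entropy for the next word
    return loss

print(rnnlm_loss([3, 17, 42, 7]))  # loss for a toy 4-token "sentence"
```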
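Item 16's LSTM structure is the standard answer to the vanishing gradient of item 15: the cell state is updated additively through forget and input gates rather than repeatedly squashed through tanh, so gradients can flow over long distances. A minimal single-step sketch, again with assumed shapes and hypothetical weight names:

```python
import numpy as np

rng = np.random.default_rng(1)
input_dim, hidden_dim = 16, 32

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# one weight matrix and bias per gate: input (i), forget (f), output (o), candidate (g)
W = {g: rng.normal(0, 0.1, (input_dim + hidden_dim, hidden_dim))
     for g in ("i", "f", "o", "g")}
b = {g: np.zeros(hidden_dim) for g in ("i", "f", "o", "g")}

def lstm_step(x, h_prev, c_prev):
    z = np.concatenate([x, h_prev])       # gates see both input and previous state
    i = sigmoid(z @ W["i"] + b["i"])      # input gate: how much new content to write
    f = sigmoid(z @ W["f"] + b["f"])      # forget gate: how much old content to keep
    o = sigmoid(z @ W["o"] + b["o"])      # output gate: how much state to expose
    g = np.tanh(z @ W["g"] + b["g"])      # candidate cell values
    c = f * c_prev + i * g                # additive cell-state update (no repeated squashing)
    h = o * np.tanh(c)
    return h, c

h = c = np.zeros(hidden_dim)
for _ in range(5):                        # unroll over a few timesteps
    h, c = lstm_step(rng.normal(size=input_dim), h, c)
print(h[:4])
```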