Class Central Classrooms (beta): YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2017 - Convolutional Networks for Text
- 1 Intro
- 2 An Example Prediction Problem: Sentence Classification
- 3 A First Try: Bag of Words (BOW)
- 4 Continuous Bag of Words (CBOW); see the CBOW sketch after this list
- 5 What do Our Vectors Represent?
- 6 Why Bag of n-grams?
- 7 What Problems w/ Bag of n-grams?
- 8 Time Delay Neural Networks (Waibel et al. 1989)
- 9 Convolutional Networks (LeCun et al. 1997)
- 10 Standard conv2d Function
- 11 Stacked Convolution
- 12 Dilated Convolution (e.g. Kalchbrenner et al. 2016); see the convolution sketch after this list
- 13 An Aside: Nonlinear Functions (proper choice of a non-linear function is essential in stacked networks)
- 14 Why (Dilated) Convolution for Modeling Sentences? (in contrast to recurrent neural networks, covered next class)
- 15 Example: Dependency Structure
- 16 Why Model Sentence Pairs?
- 17 Siamese Network (Bromley et al. 1993)
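
The bag-of-words models named in items 3-5 can be summarized in a few lines of code. The sketch below, in plain NumPy, builds a CBOW-style scorer: it sums the embeddings of the words in a sentence and maps the result to one score per label. All names and sizes in it (VOCAB_SIZE, EMB_DIM, NUM_LABELS, cbow_scores) are hypothetical choices for illustration, not code from the lecture.

```python
import numpy as np

# Minimal CBOW-style sentence scorer (illustrative sketch only).
# The sizes below are arbitrary assumptions, not values from the lecture.
VOCAB_SIZE, EMB_DIM, NUM_LABELS = 10_000, 64, 5

rng = np.random.default_rng(0)
W_emb = rng.normal(scale=0.1, size=(VOCAB_SIZE, EMB_DIM))  # word embedding table
W_out = rng.normal(scale=0.1, size=(EMB_DIM, NUM_LABELS))  # output weights
b_out = np.zeros(NUM_LABELS)                               # output bias

def cbow_scores(word_ids):
    """Sum the embeddings of all words (word order is ignored), then score labels."""
    sentence_vec = W_emb[word_ids].sum(axis=0)  # continuous bag of words
    return sentence_vec @ W_out + b_out         # one unnormalized score per label

# Example: a four-word sentence given as arbitrary word ids.
print(cbow_scores([12, 407, 9_981, 3]))
```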
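
For the convolutional part of the lecture (items 10-12), the sketch below stacks 1D convolutions over a sentence of embeddings using off-the-shelf PyTorch layers, doubling the dilation at each layer and max-pooling over time at the end. The choice of PyTorch, the layer sizes, the dilation schedule, and the pooling step are all assumptions made for this sketch, not the lecture's reference implementation.

```python
import torch
import torch.nn as nn

# Illustrative stack of 1D convolutions over a sentence, with the dilation
# doubling at each layer so the receptive field grows with depth.
# Layer sizes and the dilation schedule are assumptions made for this sketch.
EMB_DIM, HID_DIM, SENT_LEN = 64, 128, 20

layers = []
in_dim = EMB_DIM
for dilation in (1, 2, 4):  # stacked, increasingly dilated convolutions
    layers += [
        # padding = dilation keeps the output the same length as the input
        nn.Conv1d(in_dim, HID_DIM, kernel_size=3, dilation=dilation, padding=dilation),
        nn.Tanh(),          # nonlinearity between convolutional layers
    ]
    in_dim = HID_DIM
conv_stack = nn.Sequential(*layers)

# A batch with one sentence of random embeddings: (batch, channels, length).
sentence = torch.randn(1, EMB_DIM, SENT_LEN)
features = conv_stack(sentence)               # -> (1, HID_DIM, SENT_LEN)
sentence_vector = features.max(dim=2).values  # max-pool over time -> (1, HID_DIM)
print(sentence_vector.shape)
```

Doubling the dilation lets three stacked width-3 filters cover a much wider span of the sentence than the same stack without dilation, which is the usual motivation for dilated convolutions over text.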