Classroom Contents
Neural Nets for NLP 2019 - Convolutional Neural Networks for Language
- 1 Intro
- 2 A First Try: Bag of Words (BOW)
- 3 Continuous Bag of Words (CBOW)
- 4 What do Our Vectors Represent?
- 5 Bag of n-grams
- 6 Why Bag of n-grams?
- 7 2-dimensional Convolutional Networks
- 8 CNNs for Sentence Modeling
- 9 Standard conv2d Function
- 10 Padding
- 11 Striding
- 12 Pooling: like convolution, but applies a reduction function feature-wise. Max pooling asks "Did you see this feature anywhere in the range?" (most common); average pooling asks how prevalent the feature is over the range (see the sketch after this list).
- 13 Stacked Convolution
- 14 Dilated Convolution (e.g. Kalchbrenner et al. 2016): gradually increase the stride at each step (no reduction in length)
- 15 Iterated Dilated Convolution (Strubell+ 2017): multiple iterations of the same stack of dilated convolutions (see the sketch after this list)
- 16 Non-linear Functions
- 17 Which Non-linearity Should I Use?
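The pooling entry above contrasts max and average pooling over the feature dimension. Below is a minimal sketch of that idea; PyTorch, the toy tensor sizes, and the single convolution layer are my own choices for illustration, not taken from the lecture.

```python
import torch
import torch.nn as nn

# Toy "sentence": batch of 1, embedding size 8, sequence length 5
# (hypothetical sizes chosen for illustration).
emb = torch.randn(1, 8, 5)

# 1-D convolution over the sequence: window of 3 words, 16 output filters.
conv = nn.Conv1d(in_channels=8, out_channels=16, kernel_size=3, padding=1)
features = torch.relu(conv(emb))      # shape: (1, 16, 5)

# Max pooling: "Did you see this feature anywhere in the range?"
max_pooled, _ = features.max(dim=2)   # shape: (1, 16)

# Average pooling: how prevalent is this feature over the range?
avg_pooled = features.mean(dim=2)     # shape: (1, 16)
```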
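The dilated and iterated dilated convolution entries describe widening the receptive field layer by layer without shrinking the sequence, then reusing the same stack several times (as in Strubell+ 2017). The following is a rough sketch under the same PyTorch assumption; the channel count, kernel size, and number of layers are illustrative, not the lecture's settings.

```python
import torch
import torch.nn as nn

class DilatedStack(nn.Module):
    """Stack of 1-D convolutions with exponentially growing dilation.

    Each layer sees a wider context while the sequence length stays fixed.
    Sizes are illustrative, not from the slides.
    """
    def __init__(self, channels=16, kernel_size=3, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            dilation = 2 ** i                              # 1, 2, 4, ...
            padding = (kernel_size - 1) // 2 * dilation    # keeps length unchanged
            self.layers.append(
                nn.Conv1d(channels, channels, kernel_size,
                          dilation=dilation, padding=padding)
            )

    def forward(self, x):
        for conv in self.layers:
            x = torch.relu(conv(x))                        # length preserved
        return x

# Iterated dilated convolution: apply the *same* stack multiple times,
# reusing its parameters across iterations.
stack = DilatedStack()
x = torch.randn(1, 16, 10)                                 # (batch, channels, length)
for _ in range(3):
    x = stack(x)
```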