Neural Nets for NLP 2019 - Convolutional Neural Networks for Language

Graham Neubig via YouTube


Classroom Contents


  1. Intro
  2. A First Try: Bag of Words (BOW)
  3. Continuous Bag of Words (CBOW)
  4. What do Our Vectors Represent?
  5. Bag of n-grams
  6. Why Bag of n-grams?
  7. 2-dimensional Convolutional Networks
  8. CNNs for Sentence Modeling
  9. Standard conv2d Function
  10. Padding
  11. Striding
  12. Pooling — like convolution, but computes a reduction function feature-wise. Max pooling: "Did you see this feature anywhere in the range?" (most common). Average pooling: how prevalent…
  13. Stacked Convolution
  14. Dilated Convolution (e.g. Kalchbrenner et al. 2016) — gradually increase the stride every time step (no reduction in length)
  15. Iterated Dilated Convolution (Strubell+ 2017) — multiple iterations of the same stack of dilated convolutions
  16. Non-linear Functions
  17. Which Non-linearity Should I Use?
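The continuous bag-of-words encoder covered early in the outline can be sketched as follows: each word maps to a dense embedding, and the sentence vector is just the sum of the word vectors, so word order is ignored. The vocabulary, embedding size, and random embeddings here are illustrative, not from the lecture.

```python
import numpy as np

# Minimal CBOW sentence encoder sketch: sum the word embeddings.
# Vocabulary and dimensions are made up for illustration.
rng = np.random.default_rng(0)
vocab = {"i": 0, "hate": 1, "love": 2, "this": 3, "movie": 4}
emb_dim = 8
E = rng.normal(size=(len(vocab), emb_dim))  # embedding matrix

def cbow_encode(tokens):
    """Sum the embeddings of the tokens into one fixed-size vector."""
    ids = [vocab[t] for t in tokens]
    return E[ids].sum(axis=0)

vec = cbow_encode(["i", "hate", "this", "movie"])
print(vec.shape)  # one fixed-size vector regardless of sentence length
```

Because the representation is a sum, any reordering of the same words yields the same vector — the limitation that motivates n-grams and convolutions later in the lecture.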
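A 1-D convolution over a sentence (the operation behind "CNNs for Sentence Modeling", with the padding and striding options the outline lists) can be sketched like this: a filter of width k slides over the embedding sequence, acting as a learned n-gram detector. All shapes, and the choice of zero padding, are illustrative assumptions.

```python
import numpy as np

# Sketch of a 1-D convolution over word embeddings.
# X: (seq_len, emb_dim); W: (k, emb_dim, n_filters); b: (n_filters,).
def conv1d(X, W, b, stride=1, pad=0):
    k = W.shape[0]
    if pad:
        X = np.pad(X, ((pad, pad), (0, 0)))  # zero-pad the sequence ends
    out = []
    for i in range(0, X.shape[0] - k + 1, stride):
        window = X[i:i + k]  # (k, emb_dim) slice of consecutive words
        out.append(np.tensordot(window, W, axes=([0, 1], [0, 1])) + b)
    return np.array(out)  # (n_windows, n_filters)

rng = np.random.default_rng(1)
X = rng.normal(size=(6, 4))     # 6 words, 4-dim embeddings
W = rng.normal(size=(3, 4, 5))  # width-3 filters, 5 output features
b = np.zeros(5)
H = conv1d(X, W, b, pad=1)      # pad=1 keeps the output length at 6
print(H.shape)
```

With no padding the output shrinks to seq_len - k + 1 windows, and a stride greater than 1 downsamples the sequence — the two knobs the Padding and Striding items refer to.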
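The pooling item contrasts two reductions over the feature map: max pooling ("did this feature fire anywhere in the range?") and average pooling (how prevalent the feature is across the range). A tiny feature map makes the difference concrete; the numbers are illustrative.

```python
import numpy as np

# Feature-wise pooling over convolution outputs.
# Rows are positions in the sentence, columns are learned features.
H = np.array([[0.1, 0.9],
              [0.8, 0.2],
              [0.3, 0.4]])

max_pooled = H.max(axis=0)   # strongest activation per feature
avg_pooled = H.mean(axis=0)  # average activation per feature
print(max_pooled, avg_pooled)
```

Both collapse a variable-length sequence into one fixed-size vector, which is why pooling is the standard bridge from convolution outputs to a classifier.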
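The dilated-convolution items describe spacing the filter taps d positions apart, so stacking layers with growing dilation (1, 2, 4, …) widens the receptive field without shrinking the sequence (cf. Kalchbrenner et al. 2016; Strubell et al. 2017 iterate the same dilated stack). A one-channel sketch, with illustrative shapes and edge padding chosen to keep the length fixed:

```python
import numpy as np

# Dilated 1-D convolution sketch: filter taps are `dilation` apart.
# x: (seq_len,) signal; w: (k,) filter. Zero-padded to preserve length.
def dilated_conv1d(x, w, dilation):
    k = len(w)
    span = dilation * (k - 1)              # total reach of the filter
    xp = np.pad(x, (span // 2, span - span // 2))
    return np.array([
        sum(w[j] * xp[i + j * dilation] for j in range(k))
        for i in range(len(x))
    ])

x = np.arange(8, dtype=float)
w = np.array([1.0, 1.0, 1.0])            # width-3 summing filter
y1 = dilated_conv1d(x, w, dilation=1)    # taps at i-1, i, i+1
y2 = dilated_conv1d(x, w, dilation=2)    # taps at i-2, i, i+2
print(len(y1), len(y2))  # length preserved at every layer
```

Each doubling of the dilation doubles how far apart the taps reach, so a few stacked layers cover the whole sentence while every layer's output stays the same length.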
