CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning

Graham Neubig via YouTube

Classroom Contents

  1. Neural Representation Learning in Natural Language Processing
  2. Neural Representation Learning for NLP
  3. What is the word representation?
  4. Why should we learn word representations?
  5. How can we get word representations?
  6. Symbolic or Distributed?
  7. Supervised or Unsupervised?
  8. Count-based or Prediction-based?
  9. Case Study: NNLM
  10. Case Study: GloVe
  11. Case Study: ELMo
  12. Case Study: BERT
  13. Software, Model, Corpus
  14. Using non-contextualized when ...
  15. Using contextualized when ... (see the representation sketch after this list)
  16. What is the sentence representation?
  17. Why do we need sentence representations?
  18. How can we learn sentence representations?
  19. Different Structural Biases
  20. Clusters of Approaches
  21. Case Study: Must-know Points about RNNs
  22. CNN: 1d and 2d Convolution
  23. CNN: Narrow/Equal/Wide Convolution (see the convolution sketch after this list)
  24. CNN: Multiple Filter Convolution
  25. Case Study: Must-know Points about Transformers
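The case studies (NNLM, GloVe, ELMo, BERT) and the "non-contextualized vs. contextualized" videos contrast static embeddings, which assign one fixed vector per word, with contextual ones, where a word's vector depends on its sentence. Below is a minimal sketch of that difference, assuming PyTorch and the Hugging Face transformers library with the bert-base-uncased checkpoint; the lecture itself does not prescribe these tools, and the embedding table here is randomly initialized purely for illustration.

```python
# Sketch: non-contextualized vs. contextualized word representations.
# Assumes torch and transformers are installed; "bert-base-uncased" is an
# illustrative checkpoint, not one prescribed by the lecture.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

# Non-contextualized: one fixed vector per vocabulary item (a word2vec/GloVe-
# style lookup table; randomly initialized here for illustration).
vocab = {"the": 0, "bank": 1, "river": 2, "deposit": 3}
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=50)
bank_vec = embed(torch.tensor([vocab["bank"]]))  # identical in every sentence

# Contextualized: the vector for "bank" depends on the surrounding sentence.
tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
for sent in ["the river bank", "the bank deposit"]:
    enc = tok(sent, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state  # (1, seq_len, 768)
    idx = enc.input_ids[0].tolist().index(tok.convert_tokens_to_ids("bank"))
    print(sent, hidden[0, idx, :3])  # differs across the two sentences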
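```

Videos 22-24 cover 1d convolution over a sequence of token embeddings and its narrow/equal/wide variants, which differ only in how much the input is padded. Below is a sketch of how the padding choice controls the output width, assuming PyTorch's nn.Conv1d; the sequence length, embedding dimension, and filter count are illustrative.

```python
# Sketch of narrow / equal / wide 1d convolution over a token sequence,
# assuming PyTorch. All dimensions are illustrative.
import torch
import torch.nn as nn

n, d, k, f = 7, 50, 3, 16          # seq len, embed dim, kernel size, filters
x = torch.randn(1, d, n)           # Conv1d expects (batch, channels, length)

narrow = nn.Conv1d(d, f, k, padding=0)       # output length n - k + 1 = 5
equal  = nn.Conv1d(d, f, k, padding=k // 2)  # output length n = 7 (odd k)
wide   = nn.Conv1d(d, f, k, padding=k - 1)   # output length n + k - 1 = 9

for name, conv in [("narrow", narrow), ("equal", equal), ("wide", wide)]:
    print(name, conv(x).shape)     # (1, 16, 5) / (1, 16, 7) / (1, 16, 9)

# "Multiple filter" convolution (video 24) just means f > 1 output channels,
# often with several kernel sizes whose outputs are concatenated.
```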
