CMU Neural Nets for NLP 2018 - Models of Words

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. What do we want to know about words?
  3. A Manual Attempt: WordNet
  4. An Answer (?): Word Embeddings!
  5. How to Train Word Embeddings?
  6. Distributional vs. Distributed Representations
  7. Count-based Methods (see the co-occurrence sketch after this list)
  8. Distributional Representations (see Goldberg 10.4.1): words appear in a context
  9. Context Window Methods
  10. Count-based and Prediction-based Methods
  11. GloVe (Pennington et al. 2014)
  12. What Contexts?
  13. Types of Evaluation
  14. Non-linear Projection: non-linear projections group things that are close in high-dimensional space, e.g. t-SNE (van der Maaten and Hinton 2008) groups things that give each other a high probability… (see the t-SNE sketch after this list)
  15. t-SNE Visualization can be Misleading! (Wattenberg et al. 2016)
  16. Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
  17. Extrinsic Evaluation: Using Word Embeddings in Systems
  18. How Do I Choose Embeddings?
  19. When are Pre-trained Embeddings Useful?
  20. Limitations of Embeddings
  21. Sub-word Embeddings (1)
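
Items 7–10 of the outline cover count-based methods built from context windows. As a rough illustration of the general idea (not taken from the lecture itself), the sketch below builds a word-context co-occurrence matrix over a tiny made-up corpus and factorizes it with truncated SVD to get dense word vectors; the corpus, window size, and dimensionality are all illustrative assumptions.

```python
import numpy as np

# Sketch of a count-based distributional representation:
# count word-context co-occurrences in a fixed window, then
# reduce the matrix with truncated SVD to get dense embeddings.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
word2id = {w: i for i, w in enumerate(vocab)}

# Symmetric context window of size 2 (illustrative choice)
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[word2id[w], word2id[sent[j]]] += 1

# Keep the top-k singular vectors as word embeddings
k = 2
U, S, Vt = np.linalg.svd(counts, full_matrices=False)
embeddings = U[:, :k] * S[:k]

def cos(a, b):
    """Cosine similarity between the embeddings of two words."""
    va, vb = embeddings[word2id[a]], embeddings[word2id[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))

# Words appearing in similar contexts end up with similar vectors
print(cos("cat", "dog"))
```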
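Items 14–15 mention non-linear projections such as t-SNE for visualizing embeddings in 2D, along with a warning that such plots can mislead. A minimal sketch, assuming scikit-learn is available and substituting random vectors for real embeddings:

```python
import numpy as np
from sklearn.manifold import TSNE

# t-SNE projects high-dimensional vectors to 2D for plotting.
# The random "embeddings" and perplexity value are illustrative assumptions.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(50, 100))  # 50 "words", 100-dim vectors

proj = TSNE(n_components=2, perplexity=5, random_state=0).fit_transform(embeddings)
print(proj.shape)  # (50, 2) -- coordinates for a scatter plot
```

As items 15 and 16 note, cluster shapes and distances in such plots depend heavily on settings like perplexity, so intrinsic and extrinsic evaluations remain necessary.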
