Neural Nets for NLP 2019 - Word Vectors

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. What do we want to know about words?
  3. A Manual Attempt: WordNet
  4. An Answer (?): Word Embeddings!
  5. Word Embeddings are Cool! (An Obligatory Slide)
  6. How to Train Word Embeddings?
  7. Distributional Representations (see Goldberg 10.4.1)
  8. Count-based Methods
  9. Word Embeddings from Language Models
  10. Context Window Methods
  11. Skip-gram (Mikolov et al. 2013) • Predict each word in the context given the word (see the sketch after this list)
  12. Count-based and Prediction-based Methods
  13. GloVe (Pennington et al. 2014)
  14. What Contexts?
  15. Types of Evaluation
  16. Extrinsic Evaluation: Using Word Embeddings in Systems
  17. Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
  18. Limitations of Embeddings
  19. Sub-word Embeddings (1)
  20. Multi-prototype Embeddings • Simple idea: words with multiple meanings should have different embeddings (Reisinger and Mooney 2010; see the clustering sketch below)
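Item 11's one-line description (predict each word in the context given the center word) maps directly to code. Below is a minimal sketch of skip-gram training with a full softmax, not the lecture's own code: the toy corpus, window size, embedding dimension, and learning rate are all illustrative assumptions, and a practical implementation would use negative sampling (as in Mikolov et al. 2013) rather than normalizing over the whole vocabulary.

```python
# Minimal skip-gram sketch: for each center word, predict each word in its
# window with a softmax over the vocabulary. Toy corpus and hyperparameters
# are illustrative assumptions, not from the lecture.
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, dim, window, lr = len(vocab), 8, 2, 0.1

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, dim))   # center-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(V, dim))  # context-word ("output") vectors

for epoch in range(200):
    for pos, word in enumerate(corpus):
        center = idx[word]
        # each word in the window around the center is a prediction target
        for off in range(-window, window + 1):
            ctx_pos = pos + off
            if off == 0 or ctx_pos < 0 or ctx_pos >= len(corpus):
                continue
            target = idx[corpus[ctx_pos]]
            scores = W_out @ W_in[center]         # dot product with every word
            probs = np.exp(scores - scores.max())
            probs /= probs.sum()                  # softmax over the vocabulary
            grad = probs.copy()                   # d(cross-entropy)/d(scores)
            grad[target] -= 1.0
            g_in = W_out.T @ grad                 # gradient w.r.t. center vector
            W_out -= lr * np.outer(grad, W_in[center])
            W_in[center] -= lr * g_in

# a word the model now rates as a likely context of "cat"
print(vocab[int(np.argmax(W_out @ W_in[idx["cat"]]))])
```

After training, W_in holds the embeddings typically kept for downstream use; the full-softmax gradient here is exact but costs O(V) per update, which is the inefficiency negative sampling avoids.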

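For item 20, a hedged sketch of the multi-prototype idea in the spirit of Reisinger and Mooney (2010): represent each occurrence of an ambiguous word by a bag-of-words vector of its surrounding context, cluster the occurrences with k-means, and treat each cluster mean as a separate prototype embedding. The toy corpus, K = 2, and the bag-of-words context representation are assumptions for illustration, not the paper's exact setup.

```python
# Multi-prototype sketch: cluster the contexts of "bank" so that money senses
# and river senses each get their own prototype vector.
import numpy as np

corpus = ("i deposited cash at the bank . "
          "the bank approved my loan . "
          "we sat on the river bank . "
          "the muddy bank of the river flooded .").split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, window, K = len(vocab), 2, 2

# one bag-of-words context vector per occurrence of the ambiguous word
occurrences = []
for pos, w in enumerate(corpus):
    if w != "bank":
        continue
    vec = np.zeros(V)
    for off in range(-window, window + 1):
        if off != 0 and 0 <= pos + off < len(corpus):
            vec[idx[corpus[pos + off]]] += 1
    occurrences.append(vec)
X = np.array(occurrences)

# tiny k-means: assign each occurrence to its nearest centroid, then re-average
rng = np.random.default_rng(0)
centroids = X[rng.choice(len(X), size=K, replace=False)]
for _ in range(10):
    assign = np.argmin(((X[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    for k in range(K):
        if (assign == k).any():
            centroids[k] = X[assign == k].mean(axis=0)

# each centroid is one "prototype" embedding for one sense of "bank"
print(assign)  # e.g. money contexts in one cluster, river contexts in the other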
```