CMU Neural Nets for NLP - Distributional Semantics and Word Vectors

Graham Neubig via YouTube

Classroom Contents

  1. Remember: Neural Models
  2. How to Train Embeddings?
  3. What do we want to know about words?
  4. Contextualization of Word Representations
  5. A Manual Attempt: WordNet
  6. An Answer (?): Word Embeddings!
  7. Distributional vs. Distributed Representations
  8. Distributional Representations (see Goldberg 10.4.1)
  9. Word Embeddings from Language Models
  10. Context Window Methods
  11. CBOW (Mikolov et al. 2013) • Predict a word based on the sum of its surrounding embeddings (see the sketch below this list)
  12. GloVe (Pennington et al. 2014)
  13. What Contexts?
  14. Types of Evaluation
  15. Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
  16. Extrinsic Evaluation: Using Word Embeddings in Systems
  17. How Do I Choose Embeddings?
  18. Limitations of Embeddings
  19. Sub-word Embeddings (2)
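
Item 11 above carries the one-line description of CBOW: predict a word from the sum of the surrounding context-word embeddings. As a concrete illustration of that idea, here is a minimal PyTorch sketch; it is not code from the lecture, and the toy corpus, window size, embedding dimension, and training settings are illustrative assumptions.

```python
# Minimal CBOW sketch (Mikolov et al. 2013): predict the center word from the
# SUM of the surrounding context-word embeddings. The corpus, window size, and
# hyperparameters below are toy/illustrative assumptions, not lecture values.
import torch
import torch.nn as nn

class CBOW(nn.Module):
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # context-word embeddings
        self.out = nn.Linear(embed_dim, vocab_size)       # scores over the vocabulary

    def forward(self, context_ids):
        # context_ids: (batch, 2 * window) indices of the surrounding words
        summed = self.embed(context_ids).sum(dim=1)       # sum the context embeddings
        return self.out(summed)                           # logits for the center word

# Build (context, center) training pairs from a toy corpus with a +/-2 word window.
corpus = "the dog chased the cat and the cat ran away".split()
vocab = {w: i for i, w in enumerate(sorted(set(corpus)))}
window = 2
pairs = []
for i in range(window, len(corpus) - window):
    context = corpus[i - window:i] + corpus[i + 1:i + window + 1]
    pairs.append(([vocab[w] for w in context], vocab[corpus[i]]))

model = CBOW(len(vocab), embed_dim=16)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(50):
    for context, center in pairs:
        logits = model(torch.tensor([context]))           # shape: (1, vocab_size)
        loss = loss_fn(logits, torch.tensor([center]))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# After training, the rows of model.embed.weight are the learned word vectors.
```

In practice, word2vec-style training replaces the full softmax over the vocabulary with negative sampling or a hierarchical softmax for efficiency; the sketch keeps the full softmax only for clarity.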
