Class Central Classrooms
YouTube videos curated by Class Central.
CMU Neural Nets for NLP - Distributional Semantics and Word Vectors

Classroom Contents
- 1 Remember: Neural Models
- 2 How to Train Embeddings?
- 3 What do we want to know about words?
- 4 Contextualization of Word Representations
- 5 A Manual Attempt: WordNet
- 6 An Answer (?): Word Embeddings!
- 7 Distributional vs. Distributed Representations
- 8 Distributional Representations (see Goldberg 10.4.1)
- 9 Word Embeddings from Language Models
- 10 Context Window Methods
- 11 CBOW (Mikolov et al. 2013): predict a word from the sum of its surrounding embeddings (see the sketch after this list)
- 12 GloVe (Pennington et al. 2014)
- 13 What Contexts?
- 14 Types of Evaluation
- 15 Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
- 16 Extrinsic Evaluation: Using Word Embeddings in Systems
- 17 How Do I Choose Embeddings?
- 18 Limitations of Embeddings
- 19 Sub-word Embeddings (2)
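Entry 11's one-line summary of CBOW lends itself to a short worked example. Below is a minimal NumPy sketch of the idea, predicting a word from the sum of its surrounding context embeddings. The toy corpus, embedding dimensionality, window size, and learning rate are illustrative assumptions rather than values from the lecture, and real word2vec replaces the full softmax used here with negative sampling or hierarchical softmax for efficiency.

```python
import numpy as np

# Toy corpus, vocabulary, and hyperparameters: all illustrative
# assumptions, not values from the lecture.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, d, window, lr = len(vocab), 16, 2, 0.1

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, d))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(V, d))  # output (target) embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for _ in range(50):  # a few passes over the toy corpus
    for pos, word in enumerate(corpus):
        # Context = word ids within `window` positions of the target.
        ctx = [word2id[corpus[j]]
               for j in range(max(0, pos - window),
                              min(len(corpus), pos + window + 1))
               if j != pos]
        target = word2id[word]

        # CBOW: the hidden vector is the sum of the context embeddings.
        h = W_in[ctx].sum(axis=0)

        # Score every vocabulary word and softmax-normalize.
        # (A full softmax is fine at this toy scale.)
        p = softmax(W_out @ h)

        # Cross-entropy gradient on the scores: p - one_hot(target).
        err = p
        err[target] -= 1.0

        # Update output embeddings, then propagate the same gradient
        # to every context word's input embedding (np.add.at handles
        # repeated context words correctly).
        grad_h = W_out.T @ err
        W_out -= lr * np.outer(err, h)
        np.add.at(W_in, ctx, -lr * grad_h)

print("learned vector for 'cat':", W_in[word2id["cat"]])
```

Because the hidden vector is a plain sum, the gradient that flows back to the context is identical for each context word, which is what makes CBOW cheap to train compared with architectures that keep context positions distinct.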