Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2019 - Word Vectors
- 1 Intro
- 2 What do we want to know about words?
- 3 A Manual Attempt: WordNet
- 4 An Answer (?): Word Embeddings!
- 5 Word Embeddings are Cool! (An Obligatory Slide)
- 6 How to Train Word Embeddings?
- 7 Distributional Representations (see Goldberg 10.4.1)
- 8 Count-based Methods
- 9 Word Embeddings from Language Models
- 10 Context Window Methods
- 11 Skip-gram (Mikolov et al. 2013): predict each word in the context given the center word (see the first sketch after this list)
- 12 Count-based and Prediction-based Methods
- 13 GloVe (Pennington et al. 2014)
- 14 What Contexts?
- 15 Types of Evaluation
- 16 Extrinsic Evaluation: Using Word Embeddings in Systems
- 17 Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
- 18 Limitations of Embeddings
- 19 Sub-word Embeddings (1)
- 20 Multi-prototype Embeddings: words with multiple meanings should have different embeddings (Reisinger and Mooney 2010) (see the second sketch after this list)
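
Chapter 11 names skip-gram's training objective: each center word predicts the words in its surrounding window. Below is a minimal sketch of that objective, assuming a toy corpus and full-softmax training; the corpus, hyperparameters, and variable names are all hypothetical, not the lecture's code.

```python
# Minimal skip-gram sketch (toy example): each center word predicts
# every word in a +/-2 window via a softmax over the vocabulary.
import numpy as np

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # center-word ("input") embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word ("output") embeddings

# training pairs: (center word, one word from its context window)
pairs = [(w2i[corpus[i]], w2i[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if j != i]

for _ in range(200):
    for c, o in pairs:
        v = W_in[c].copy()              # center embedding
        scores = W_out @ v              # one score per vocabulary word
        p = np.exp(scores - scores.max())
        p /= p.sum()                    # softmax over the vocabulary
        p[o] -= 1.0                     # d(cross-entropy)/d(scores)
        W_in[c] -= lr * (W_out.T @ p)   # update the center vector
        W_out -= lr * np.outer(p, v)    # update all output vectors

# nearest neighbours of "cat" under cosine similarity
q = W_in[w2i["cat"]]
sims = W_in @ q / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(q))
print(sorted(zip(sims, vocab), reverse=True)[:3])
```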
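
Chapter 20's multi-prototype idea (Reisinger and Mooney 2010) clusters the contexts in which a word occurs and keeps one embedding per cluster, so each sense gets its own vector. A minimal sketch, assuming bag-of-words context vectors and a hand-rolled two-cluster k-means; the toy sentences and every name here are hypothetical illustrations, not the paper's method in detail.

```python
# Multi-prototype sketch (toy example): one context vector per occurrence
# of the ambiguous word "bank", clustered into two sense prototypes.
import numpy as np

sentences = [
    "the river bank was muddy and wet".split(),
    "the bank approved the loan request".split(),
    "fish swim near the bank of the river".split(),
    "the bank charged a large interest fee".split(),
]
vocab = sorted({w for s in sentences for w in s})
w2i = {w: i for i, w in enumerate(vocab)}

def context_vector(sent, i, window=3):
    """Bag-of-words count vector of the words around position i."""
    v = np.zeros(len(vocab))
    for j in range(max(0, i - window), min(len(sent), i + window + 1)):
        if j != i:
            v[w2i[sent[j]]] += 1.0
    return v

# one context vector per occurrence of "bank"
X = np.array([context_vector(s, i)
              for s in sentences
              for i, w in enumerate(s) if w == "bank"])

# tiny k-means (k=2): each cluster of contexts becomes one sense prototype
rng = np.random.default_rng(0)
centers = X[rng.choice(len(X), size=2, replace=False)]
for _ in range(10):
    assign = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
    centers = np.array([X[assign == k].mean(0) if (assign == k).any()
                        else centers[k] for k in (0, 1)])

print("occurrence -> sense cluster:", assign)
```

Each row of `centers` is then a separate prototype vector for "bank", used in place of a single embedding.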