CMU Neural Nets for NLP 2018 - Models of Words
- 1 Intro
- 2 What do we want to know about words?
- 3 A Manual Attempt: WordNet
- 4 An Answer (?): Word Embeddings!
- 5 How to Train Word Embeddings?
- 6 Distributional vs. Distributed Representations
- 7 Count-based Methods
- 8 Distributional Representations (see Goldberg 10.4.1): words appear in a context
- 9 Context Window Methods
- 10 Count-based and Prediction-based Methods
- 11 GloVe (Pennington et al. 2014)
- 12 What Contexts?
- 13 Types of Evaluation
- 14 Non-linear Projection • Non-linear projections group things that are close in high-dimensional space, e.g. SNE/t-SNE (van der Maaten and Hinton 2008) group things that give each other a high probability… (see the sketch after this list)
- 15 t-SNE Visualization can be Misleading! (Wattenberg et al. 2016)
- 16 Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
- 17 Extrinsic Evaluation: Using Word Embeddings in Systems
- 18 How Do I Choose Embeddings?
- 19 When are Pre-trained Embeddings Useful?
- 20 Limitations of Embeddings
- 21 Sub-word Embeddings (1)
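
As a concrete illustration of the non-linear projection idea in item 14, here is a minimal sketch of projecting word embeddings to 2D with t-SNE via scikit-learn. The vocabulary and vectors below are stand-ins (real trained embeddings such as GloVe or skip-gram vectors would be loaded instead), and the hyperparameters are illustrative only; as item 15 (Wattenberg et al. 2016) warns, the resulting picture can change substantially under different perplexity settings and random seeds.

```python
import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

# Stand-in embeddings: random vectors keep the sketch self-contained.
# In practice, load trained vectors (e.g. GloVe or skip-gram) here instead.
rng = np.random.default_rng(0)
words = [f"word_{i}" for i in range(100)]       # hypothetical vocabulary
vectors = rng.normal(size=(len(words), 300))    # hypothetical 300-dim embeddings

# Non-linear projection to 2D: t-SNE tries to keep points that assign each
# other high probability in the high-dimensional space close together in 2D.
proj = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(vectors)

plt.figure(figsize=(8, 8))
plt.scatter(proj[:, 0], proj[:, 1], s=10)
for word, (x, y) in zip(words, proj):
    plt.annotate(word, (x, y), fontsize=6)
plt.title("t-SNE projection of word embeddings (illustrative)")
plt.show()
```

Rerunning the projection with a different `perplexity` or `random_state` is a quick way to check whether an apparent cluster is stable or an artifact of the visualization.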