Neural Nets for NLP 2021 - Distributional Semantics and Word Vectors
- 1 Intro
- 2 Remember: Neural Models
- 3 How to Train Embeddings?
- 4 What do we want to know about words?
- 5 Contextualization of Word Representations
- 6 A Manual Attempt: WordNet
- 7 An Answer (?): Word Embeddings!
- 8 Word Embeddings are Cool! (An Obligatory Slide)
- 9 Distributional vs. Distributed Representations
- 10 Distributional Representations (see Goldberg 10.4.1)
- 11 Count-based Methods
- 12 Prediction-based Methods (see Goldberg 10.4.2)
- 13 Word Embeddings from Language Models
- 14 Context Window Methods
- 15 GloVe (Pennington et al. 2014)
- 16 What Contexts?
- 17 Types of Evaluation
- 18 Non-linear Projection: non-linear projections group things that are close in high-dimensional space (a minimal t-SNE sketch follows this list)
- 19 t-SNE Visualization can be Misleading! (Wattenberg et al. 2016)
- 20 Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
- 21 Extrinsic Evaluation
- 22 How Do I Choose Embeddings?
- 23 When are Pre-trained Embeddings Useful?
- 24 Limitations of Embeddings
- 25 Unsupervised Coordination of Embeddings
- 26 Retrofitting of Embeddings to Existing Lexicons: we have an existing lexicon like WordNet and would like our vectors to match it (Faruqui et al. 2015) (a minimal retrofitting sketch follows this list)
- 27 Sparse Embeddings
- 28 De-biasing Word Embeddings
- 29 FastText Toolkit
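
As a companion to item 18, here is a minimal sketch of projecting word vectors to 2-D with t-SNE. The random 100-dimensional vectors and the small word list are stand-ins, not real trained embeddings; in practice you would load vectors from a trained model.

```python
# Minimal sketch: t-SNE maps high-dimensional word vectors to 2-D so that
# points close in the original space tend to stay close in the plot.
# The vectors below are random stand-ins for real embeddings (an assumption).
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
words = ["cat", "dog", "fish", "car", "bus", "train"]
vectors = rng.normal(size=(len(words), 100))  # stand-in 100-d embeddings

# perplexity must be smaller than the number of points
coords = TSNE(n_components=2, perplexity=3, random_state=0).fit_transform(vectors)

plt.scatter(coords[:, 0], coords[:, 1])
for word, (x, y) in zip(words, coords):
    plt.annotate(word, (x, y))
plt.title("t-SNE projection of word vectors (toy data)")
plt.show()
```

As item 19 warns, the resulting layout is sensitive to hyperparameters such as perplexity, so cluster shapes and apparent distances should not be over-interpreted (Wattenberg et al. 2016).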
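For item 26, here is a minimal sketch of the retrofitting update from Faruqui et al. 2015: each vector is pulled toward the average of its lexicon neighbors while staying close to its original value. The toy embeddings and the three-word synonym lexicon are illustrative assumptions; in practice the lexicon would come from WordNet or a similar resource.

```python
# Minimal sketch of retrofitting (Faruqui et al. 2015): iteratively replace
# each vector with a weighted average of its original value and the current
# vectors of its lexicon neighbors. Toy data below is an assumption.
import numpy as np

embeddings = {
    "happy":  np.array([1.0, 0.0]),
    "glad":   np.array([0.0, 1.0]),
    "joyful": np.array([0.5, 0.5]),
}
lexicon = {"happy": ["glad", "joyful"], "glad": ["happy"], "joyful": ["happy"]}

def retrofit(embeddings, lexicon, iterations=10, alpha=1.0):
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, neighbors in lexicon.items():
            neighbors = [n for n in neighbors if n in new]
            if not neighbors:
                continue
            beta = 1.0 / len(neighbors)  # weight each neighbor equally
            # Closed-form update: blend the original vector with the
            # (weighted) sum of the current neighbor vectors.
            numerator = alpha * embeddings[word] + beta * sum(new[n] for n in neighbors)
            new[word] = numerator / (alpha + beta * len(neighbors))
    return new

retrofitted = retrofit(embeddings, lexicon)
print(retrofitted["happy"])
```

With alpha = 1 and neighbors weighted by 1/degree, each update is simply the midpoint between a word's original vector and the mean of its neighbors' current vectors, so retrofitted synonyms end up closer together without drifting far from the pre-trained space.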