Overview
Syllabus
Intro
What do we want to know about words?
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
Word Embeddings are Cool! (An Obligatory Slide)
How to Train Word Embeddings?
Distributional Representations (see Goldberg 10.4.1)
Count-based Methods
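A sketch of one common count-based pipeline (in the spirit of the Goldberg reading above; illustrative, not necessarily the lecture's exact recipe): count word-context co-occurrences in a window, reweight with PPMI, and reduce with a truncated SVD. The toy corpus and dimensions are arbitrary choices.

    # Count-based sketch: co-occurrence counts -> PPMI -> truncated SVD.
    import numpy as np

    corpus = "the cat sat on the mat the dog sat on the rug".split()
    vocab = sorted(set(corpus))
    w2i = {w: i for i, w in enumerate(vocab)}
    V, window = len(vocab), 2

    # word-context co-occurrence counts within a symmetric window
    counts = np.zeros((V, V))
    for pos, word in enumerate(corpus):
        for ctx in corpus[max(0, pos - window):pos] + corpus[pos + 1:pos + 1 + window]:
            counts[w2i[word], w2i[ctx]] += 1

    # positive PMI: max(0, log P(w,c) / (P(w) P(c)))
    total = counts.sum()
    p_w = counts.sum(1, keepdims=True) / total
    p_c = counts.sum(0, keepdims=True) / total
    with np.errstate(divide="ignore"):
        pmi = np.log((counts / total) / (p_w * p_c))
    ppmi = np.maximum(pmi, 0)

    # truncated SVD: rows of U * S are dense word embeddings
    U, S, _ = np.linalg.svd(ppmi)
    embeddings = U[:, :8] * S[:8]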
Word Embeddings from Language Models
Context Window Methods
Skip-gram (Mikolov et al. 2013) • Predict each word in the context window given the center word
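A minimal NumPy sketch of this objective (illustrative, not the lecture's code): each word in a small window around the center word is a positive (center, context) pair, trained with negative sampling; the toy corpus, embedding size, and learning rate are arbitrary choices.

    # Skip-gram with negative sampling, minimal NumPy version.
    import numpy as np

    rng = np.random.default_rng(0)
    corpus = "the cat sat on the mat the dog sat on the rug".split()
    vocab = sorted(set(corpus))
    w2i = {w: i for i, w in enumerate(vocab)}
    V, D, WINDOW, NEG, LR = len(vocab), 16, 2, 5, 0.05

    W_in = rng.normal(scale=0.1, size=(V, D))   # center-word embeddings
    W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for epoch in range(200):
        for pos, word in enumerate(corpus):
            center = w2i[word]
            for off in range(-WINDOW, WINDOW + 1):
                ctx_pos = pos + off
                if off == 0 or not 0 <= ctx_pos < len(corpus):
                    continue
                # one observed pair (label 1) plus NEG noise words (label 0)
                pairs = [(w2i[corpus[ctx_pos]], 1.0)]
                pairs += [(int(rng.integers(V)), 0.0) for _ in range(NEG)]
                for ctx, label in pairs:
                    grad = sigmoid(W_in[center] @ W_out[ctx]) - label
                    g_in = grad * W_out[ctx]          # save before updating W_out
                    W_out[ctx] -= LR * grad * W_in[center]
                    W_in[center] -= LR * g_in

    # words appearing in similar contexts should end up close
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cos(W_in[w2i["cat"]], W_in[w2i["dog"]]))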
Count-based and Prediction-based Methods
GloVe (Pennington et al. 2014)
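For reference, GloVe's objective from Pennington et al. 2014 is a weighted least-squares fit of word-vector dot products to log co-occurrence counts, which is how it bridges the count-based and prediction-based views:

    J = \sum_{i,j=1}^{V} f(X_{ij})
        \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
    \qquad
    f(x) = \begin{cases}
      (x / x_{\max})^{\alpha} & \text{if } x < x_{\max} \\
      1 & \text{otherwise}
    \end{cases}

Here X_{ij} counts how often word j occurs in the context of word i; the paper uses \alpha = 3/4 and x_{\max} = 100.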
What Contexts?
Types of Evaluation
Extrinsic Evaluation: Using Word Embeddings in Systems
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Limitations of Embeddings
Sub-word Embeddings
Multi-prototype Embeddings • Simple idea: words with multiple meanings should have different embeddings (Reisinger and Mooney 2010)
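A simplified sketch of that idea (in the spirit of Reisinger and Mooney 2010; the clustering details here are illustrative, not the paper's exact method): represent each occurrence of a word by the mean embedding of its context words, cluster the occurrence vectors, and keep one prototype embedding per cluster, i.e. per induced sense. `embed` is assumed to be a mapping from word to a pretrained vector.

    # Multi-prototype sketch: cluster a word's contexts, one embedding per cluster.
    import numpy as np

    def occurrence_vectors(occurrences, embed, window=2):
        # One vector per occurrence of the target word:
        # the mean embedding of the words in its context window.
        vecs = []
        for tokens, pos in occurrences:  # (sentence tokens, index of target word)
            ctx = tokens[max(0, pos - window):pos] + tokens[pos + 1:pos + 1 + window]
            vecs.append(np.mean([embed[w] for w in ctx], axis=0))
        return np.stack(vecs)

    def multi_prototypes(X, k=2, iters=20, seed=0):
        # Tiny k-means over occurrence vectors; each cluster center is
        # one prototype embedding, i.e. one induced word sense.
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            assign = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
            for j in range(k):
                if (assign == j).any():
                    centers[j] = X[assign == j].mean(axis=0)
        return centers

At lookup time, a new occurrence of the word can then be matched to the prototype nearest to its context vector.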
Taught by Graham Neubig