
Neural Nets for NLP 2019 - Word Vectors

Graham Neubig via YouTube

Overview

Explore word vectors in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into techniques for describing words by their context, including counting and prediction methods like skip-grams and CBOW. Learn how to evaluate and visualize word vectors, and discover advanced methods for creating more nuanced word representations. Examine the limitations of traditional embeddings and explore solutions like sub-word and multi-prototype embeddings. Gain insights into both intrinsic and extrinsic evaluation methods for word embeddings, and understand their practical applications in NLP systems.
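To make the count-based side of this concrete, here is a minimal sketch that represents each word by its co-occurrence counts within a context window and compares words by cosine similarity. The toy corpus, window size, and helper names are illustrative assumptions, not material from the lecture.

```python
from collections import defaultdict
import math

# Hypothetical toy corpus; a real setup would use a large text collection.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

window = 2  # context words considered on each side
counts = defaultdict(lambda: defaultdict(int))

# Count-based method: tally which words appear near which.
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[w][sent[j]] += 1

vocab = sorted({w for s in corpus for w in s})

def vector(word):
    # A word is described by its context counts over the vocabulary.
    return [counts[word][c] for c in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Words that occur in similar contexts ("cat"/"dog") come out more
# similar than words that share little context ("cat"/"rug").
sim_cat_dog = cosine(vector("cat"), vector("dog"))
sim_cat_rug = cosine(vector("cat"), vector("rug"))
```

Even on this tiny corpus, "cat" and "dog" end up closer to each other than to "rug", which is the distributional intuition the lecture builds on before moving to prediction-based methods.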

Syllabus

Intro
What do we want to know about words?
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
Word Embeddings are Cool! (An Obligatory Slide)
How to Train Word Embeddings?
Distributional Representations (see Goldberg 10.4.1)
Count-based Methods
Word Embeddings from Language Models
Context Window Methods
Skip-gram (Mikolov et al. 2013): predict each word in the context given the center word
Count-based and Prediction-based Methods
GloVe (Pennington et al. 2014)
What Contexts?
Types of Evaluation
Extrinsic Evaluation: Using Word Embeddings in Systems
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Limitations of Embeddings
Sub-word Embeddings (1)
Multi-prototype Embeddings: the simple idea that words with multiple meanings should have different embeddings (Reisinger and Mooney 2010)
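The skip-gram objective from the syllabus can be sketched in a few lines of NumPy. This is a minimal full-softmax version trained on a made-up toy corpus; the corpus, embedding dimension, learning rate, and epoch count are all illustrative assumptions (the original method uses negative sampling on far larger data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy corpus; real training uses billions of tokens.
tokens = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(tokens))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2

# Two embedding tables: W for center words, C for context words.
W = rng.normal(scale=0.1, size=(V, D))
C = rng.normal(scale=0.1, size=(V, D))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Skip-gram: for each center word, predict every word in its window.
lr = 0.1
for _ in range(200):
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i == j:
                continue
            wi, ci = idx[w], idx[tokens[j]]
            scores = C @ W[wi]            # logits over the whole vocabulary
            p = softmax(scores)
            p[ci] -= 1.0                  # gradient of cross-entropy w.r.t. logits
            grad_W = C.T @ p              # gradient for the center embedding
            C -= lr * np.outer(p, W[wi])  # update context embeddings
            W[wi] -= lr * grad_W          # update center embedding

emb = W  # learned word vectors, one row per vocabulary item
```

After training, rows of `emb` serve as the word vectors; the full softmax over `V` is what negative sampling approximates to make this tractable at scale.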

Taught by

Graham Neubig

