

CMU Neural Nets for NLP - Distributional Semantics and Word Vectors

Graham Neubig via YouTube

Overview

Explore distributional semantics and word vectors in this lecture from CMU's Neural Networks for NLP course. Delve into describing words by their contexts, count-based and prediction-based techniques, the skip-gram and CBOW models, and methods for evaluating and visualizing word vectors. Learn how word representations have progressed from manually constructed resources such as WordNet to learned embedding models like CBOW and GloVe, and how those representations can be contextualized. Examine different types of context, intrinsic and extrinsic evaluation of embeddings, and their practical use in NLP systems. Discuss the limitations of embeddings and explore sub-word embedding techniques. Gain insight into the foundations of word representations in natural language processing.
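
To make the "describe words by the contexts they appear in" idea concrete, here is a minimal Python sketch of count-based distributional vectors compared with cosine similarity. The toy corpus, window size, and helper names are illustrative assumptions, not material from the lecture itself.

```python
# Count-based distributional word vectors: each word is represented by
# counts of the words that occur within a small window around it, and
# words are compared with cosine similarity over those counts.
from collections import Counter, defaultdict
import math

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]
window = 2  # context window of +/- 2 words (illustrative choice)

cooc = defaultdict(Counter)
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                cooc[w][sent[j]] += 1

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv + 1e-12)

# Words that share contexts ("cat" / "dog") score higher than unrelated pairs.
print(cosine(cooc["cat"], cooc["dog"]))
print(cosine(cooc["cat"], cooc["on"]))
```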

Syllabus

Remember: Neural Models
How to Train Embeddings?
What do we want to know about words?
Contextualization of Word Representations
A Manual Attempt: WordNet
An Answer (?): Word Embeddings!
Distributional vs. Distributed Representations
Distributional Representations (see Goldberg 10.4.1)
Word Embeddings from Language Models
Context Window Methods
CBOW (Mikolov et al. 2013) • Predict word based on sum of surrounding embeddings (see the sketch after this syllabus)
GloVe (Pennington et al. 2014)
What Contexts?
Types of Evaluation
Intrinsic Evaluation of Embeddings (categorization from Schnabel et al. 2015)
Extrinsic Evaluation: Using Word Embeddings in Systems
How Do I Choose Embeddings?
Limitations of Embeddings
Sub-word Embeddings (2)
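
The CBOW item above describes predicting a word from the sum of its surrounding embeddings. Below is a minimal PyTorch sketch of that idea; the vocabulary size, embedding dimension, and the random single training step are illustrative assumptions rather than the lecture's actual setup.

```python
# Toy CBOW model: sum the embeddings of the context words, then score
# every vocabulary item as a candidate for the center word.
import torch
import torch.nn as nn

class CBOW(nn.Module):
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, context):                  # context: (batch, 2*window) word ids
        summed = self.emb(context).sum(dim=1)    # sum of surrounding embeddings
        return self.out(summed)                  # scores over the vocabulary

vocab_size, dim = 100, 16
model = CBOW(vocab_size, dim)
loss_fn = nn.CrossEntropyLoss()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# One fake training step on random word ids, just to show the shapes.
context = torch.randint(0, vocab_size, (8, 4))   # 8 examples, window of 2 on each side
center = torch.randint(0, vocab_size, (8,))
loss = loss_fn(model(context), center)
opt.zero_grad()
loss.backward()
opt.step()
```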

Taught by

Graham Neubig
