

Vector Semantics and Word Embeddings in Natural Language Processing

UofU Data Science via YouTube

Overview

Learn about vector semantics and word embeddings in this comprehensive lecture covering fundamental concepts of natural language processing. Explore representation learning techniques and dive deep into various aspects of word meaning. Understand the distributional hypothesis and its importance in modern NLP applications. Master the principles of vector semantics and the influential word2vec model for creating word embeddings. Conclude by examining practical applications including visualization techniques, solving word analogies, and understanding potential biases in embedding models.
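The analogy-solving idea covered at the end of the lecture can be sketched with toy vectors. The values below are hypothetical, chosen only to illustrate the mechanism; real word2vec embeddings have hundreds of dimensions learned from large corpora:

```python
import numpy as np

# Toy 3-dimensional "embeddings" (hypothetical values for illustration only).
vectors = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
}

def cosine(u, v):
    """Cosine similarity: the standard closeness measure in vector semantics."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c):
    """Solve a : b :: c : ? by finding the word whose vector is
    closest to b - a + c (the parallelogram method)."""
    target = vectors[b] - vectors[a] + vectors[c]
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # → queen with these toy vectors
```

With these vectors, "king" minus "man" isolates a royalty direction, and adding "woman" lands nearest to "queen" — the same arithmetic popularized by the word2vec papers.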

Syllabus

Lecture starts
Representation learning
Aspects of word meaning
Distributional hypothesis
Vector semantics
word2vec
Visualization, analogies, bias

Taught by

UofU Data Science

