
YouTube

Word2Vec: Distributed Representations of Words and Phrases and Their Compositionality

Yannic Kilcher via YouTube

Overview

Explore the influential Word2Vec technique for generating distributed word representations in this comprehensive video lecture. Delve into the Skip-Gram model, hierarchical softmax, and negative sampling methods. Learn about the mysterious 3/4 power, frequent word subsampling, and their impact on training efficiency. Examine empirical results and gain insights into the practical applications of Word2Vec in modern natural language processing. Understand how this technique captures syntactic and semantic word relationships, and discover its limitations in representing word order and idiomatic phrases.
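The Skip-Gram model with negative sampling mentioned above can be sketched in a few lines of NumPy. This is an illustrative toy implementation, not the video's or the paper's reference code: the corpus, window size, learning rate, and uniform negative sampling are all simplifying assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus and vocabulary (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16  # vocabulary size, embedding dimension

W_in = 0.01 * rng.standard_normal((V, D))   # "input" (center-word) vectors
W_out = 0.01 * rng.standard_normal((V, D))  # "output" (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, k=5, lr=0.05):
    """One skip-gram negative-sampling update for a (center, context) pair."""
    c, o = idx[center], idx[context]
    # Negatives drawn uniformly here for simplicity; the paper uses a
    # unigram distribution raised to the 3/4 power.
    negatives = rng.integers(0, V, size=k)
    v = W_in[c]
    # Positive pair: push sigmoid(v . u_o) toward 1.
    u = W_out[o]
    g = sigmoid(v @ u) - 1.0
    W_out[o] -= lr * g * v
    grad_v = g * u
    # Negative pairs: push sigmoid(v . u_n) toward 0.
    for n in negatives:
        un = W_out[n]
        gn = sigmoid(v @ un)
        W_out[n] -= lr * gn * v
        grad_v += gn * un
    W_in[c] -= lr * grad_v

# Train on all (center, context) pairs within a window of 2.
for _ in range(200):
    for i, w in enumerate(corpus):
        for j in range(max(0, i - 2), min(len(corpus), i + 3)):
            if j != i:
                sgns_step(w, corpus[j])
```

Each update only touches the center vector and k+1 output vectors, which is why negative sampling is so much cheaper than a full softmax over the vocabulary.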

Syllabus

- Intro & Outline
- Distributed Word Representations
- Skip-Gram Model
- Hierarchical Softmax
- Negative Sampling
- Mysterious 3/4 Power
- Frequent Words Subsampling
- Empirical Results
- Conclusion & Comments
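Two of the syllabus items above, the "mysterious 3/4 power" and frequent-word subsampling, reduce to short formulas. The sketch below is a hedged illustration on a toy corpus; the threshold `t = 1e-2` is chosen for the tiny example, whereas the paper suggests values around 1e-5 for real corpora.

```python
import collections
import numpy as np

corpus = "the cat sat on the mat the cat ran".split()
counts = collections.Counter(corpus)
words = sorted(counts)
freqs = np.array([counts[w] for w in words], dtype=float)

# Noise distribution for negative sampling: unigram counts raised to the
# 3/4 power, which boosts rare words relative to raw unigram frequency.
noise = freqs ** 0.75
noise /= noise.sum()

# Frequent-word subsampling: keep each occurrence of word w with
# probability min(1, sqrt(t / f(w))), where f(w) is relative frequency.
t = 1e-2  # illustrative threshold for this toy corpus
rel = freqs / freqs.sum()
p_keep = np.minimum(1.0, np.sqrt(t / rel))
```

Frequent words like "the" get a low keep probability, so training spends its updates on the rarer, more informative co-occurrences.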

Taught by

Yannic Kilcher

