
YouTube

Deep Learning for Natural Language Processing

Alfredo Canziani via YouTube

Overview

Explore the foundations and advanced concepts of deep learning for Natural Language Processing (NLP) in this comprehensive lecture. Delve into the architectures used in NLP applications, including CNNs, RNNs, and the state-of-the-art transformer model. Understand the modules that make transformers advantageous for NLP tasks and learn effective training techniques. Discover beam search as a middle ground between greedy decoding and exhaustive search, and explore "top-k" sampling for text generation. Examine sequence-to-sequence models, back-translation, and unsupervised approaches to learning embeddings, including word2vec, GPT, and BERT. Gain insights into pre-training techniques for NLP and future directions in the field.
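To make the decoding ideas above concrete, here is a minimal sketch of top-k sampling, one of the text-generation strategies covered in the lecture. The `logits` list and the function name are illustrative assumptions, not code from the course: a real language model would produce a fresh logit vector over the vocabulary at each decoding step.

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from the k highest-scoring logits.

    `logits` is a hypothetical list of raw model scores over the
    vocabulary; `k` controls how many candidates survive.
    """
    # Keep only the indices of the k largest logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits so they form a distribution.
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Draw one index according to the renormalized distribution.
    return rng.choices(top, weights=probs, k=1)[0]

# Greedy decoding is the special case k = 1: always pick the argmax.
logits = [2.0, 0.5, 1.5, -1.0]
assert top_k_sample(logits, k=1) == 0
```

Setting `k = 1` recovers greedy decoding, while `k = len(logits)` samples from the full distribution; intermediate values trade diversity against quality, which is the same trade-off beam search navigates for whole sequences.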

Syllabus

– Week 12 – Lecture
– Introduction to deep learning in NLP and language models
– Transformer language model structure and intuition
– Tricks and facts about Transformer language models, and decoding from language models
– Beam Search, Sampling and Text Generation
– Back-translation, word2vec and BERT
– Pre-training for NLP and Next Steps

Taught by

Alfredo Canziani
