Neural Nets for NLP 2020 - Attention

Graham Neubig via YouTube

Overview

Explore attention mechanisms in neural networks for natural language processing in this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into various aspects of attention, including what to attend to, improvements to attention techniques, and specialized attention varieties. Examine a case study on the "Attention is All You Need" paper, and learn about attention score functions, input sentence handling, multi-headed attention, and training tricks. Gain insights into incorporating Markov properties, supervised training for attention, and hard attention concepts. This in-depth presentation covers essential topics for understanding and implementing attention in NLP models.
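To ground the topics the lecture covers, here is a minimal sketch of the basic idea behind attention: score each source position against a query, normalize the scores with a softmax, and take the weighted sum of the values. This is a generic dot-product score function (one of several score functions the lecture surveys), not code from the course itself; the function name and toy inputs are illustrative.

```python
import numpy as np

def dot_product_attention(query, keys, values):
    """Dot-product attention: score each key against the query,
    normalize with softmax, and return the weighted sum of values."""
    scores = keys @ query                    # one scalar score per source position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ values               # weighted sum of value vectors
    return context, weights

# Toy example: 3 source positions with 3-dimensional keys and values.
keys = np.eye(3)
values = np.array([[1.0, 0.0, 0.0],
                   [0.0, 2.0, 0.0],
                   [0.0, 0.0, 3.0]])
query = keys[1]  # a query matching the second key attends mostly to it

context, weights = dot_product_attention(query, keys, values)
```

Other score functions discussed in the lecture (e.g. bilinear or multi-layer perceptron scores) replace only the `keys @ query` line; multi-headed attention runs several such scorings in parallel over projected queries, keys, and values.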

Syllabus

Intro
Sentence Representations
Calculating Attention (1)
A Graphical Example
Attention Score Functions (1)
Attention Score Functions (2)
Input Sentence: Copy
Input Sentence: Bias (Arthur et al. 2016)
Previously Generated Things
Various Modalities
Multiple Sources
Coverage
Incorporating Markov Properties (Cohn et al. 2015)
Supervised Training (Mi et al. 2016)
Hard Attention
Multi-headed Attention
Attention Tricks
Training Tricks

Taught by

Graham Neubig
