Understanding Neural Attention in Deep Learning - From Basics to Transformers

Neural Breakdown with AVB via YouTube

Overview

Learn the foundational concepts of neural attention through an 18-minute video that breaks down the core mechanism powering Transformer architectures and Large Language Models (LLMs). Progress from building a basic recommendation system to understanding contrastive learning, mean pooling, and weighted attention mechanisms. Master encoder-decoder attention and multiheaded attention through visual illustrations and practical examples. Explore real-world applications in machine translation while examining key research papers such as "Attention Is All You Need" and "Neural Machine Translation by Jointly Learning to Align and Translate." Part of a comprehensive series on neural networks; the follow-up video focuses on self-attention.
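
The "attention as a weighted mean" idea mentioned above can be sketched in a few lines of NumPy. This is an illustrative example, not code from the video: mean pooling gives every token vector equal weight, while attention derives its weights from a softmax over query-token similarity scores (function names like `attention_pool` are hypothetical).

```python
import numpy as np

def softmax(scores):
    # Numerically stable softmax: weights are non-negative and sum to 1.
    exps = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return exps / exps.sum(axis=-1, keepdims=True)

def mean_pool(token_vectors):
    # Mean pooling: every token contributes equally.
    return token_vectors.mean(axis=0)

def attention_pool(token_vectors, query):
    # Attention as a weighted mean: weights come from a softmax
    # over dot-product similarity between the query and each token.
    scores = token_vectors @ query      # shape: (n_tokens,)
    weights = softmax(scores)           # sums to 1
    return weights @ token_vectors      # weighted mean of token vectors

# Toy example: three 2-d token vectors and a query aligned with axis 0.
tokens = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])
query = np.array([1.0, 0.0])

print(mean_pool(tokens))               # equal weighting of all tokens
print(attention_pool(tokens, query))   # skewed toward tokens similar to the query
```

Compared with the plain mean, the attention-pooled vector leans toward the tokens most similar to the query, which is the intuition the video builds on before introducing encoder-decoder and multiheaded attention.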

Syllabus

- Intro
- Let's make a Recommendation System
- Contrastive Learning
- A Chatty Recommender
- Mean Pooling
- Attention as a weighted mean
- Math
- Machine Translation
- Encoder Decoder Attention
- More Neural Attention
- Multiheaded Attention
- Conclusion and Next Video!

Taught by

Neural Breakdown with AVB

