

Attention and Transformer Encoder Architecture

UofU Data Science via YouTube

Overview

Learn about the fundamental concepts and architecture of the Transformer Encoder model in this detailed 77-minute lecture. Explore the attention mechanism that revolutionized natural language processing and understand how this key component of modern transformer models processes and analyzes sequential data. Dive into the technical aspects of self-attention, multi-head attention, and positional encoding that make transformers highly effective for various machine learning tasks.
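To make the overview concrete, here is a minimal NumPy sketch of single-head scaled dot-product self-attention, the core operation the lecture covers. This is an illustrative assumption, not material taken from the course; the dimensions, random projection matrices, and variable names are hypothetical choices for demonstration only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # similarity of each query with each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

# Toy self-attention: the same token sequence supplies queries, keys, and values.
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                        # hypothetical sizes for the example
X = rng.normal(size=(seq_len, d_model))        # 4 token embeddings of width 8
W_q = rng.normal(size=(d_model, d_model))      # projections would be learned; random here
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

output, attn = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(output.shape, attn.shape)                # (4, 8) (4, 4)
```

In a full transformer encoder this operation is repeated across several heads (multi-head attention) and combined with positional encodings, which the lecture discusses in detail.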

Syllabus

Lecture starts

Taught by

UofU Data Science

