Adding Attention Mechanisms to Language Translation RNNs in PyTorch - Lab 4.8


Donato Capitella via YouTube


Classroom Contents


  1. 1 - Bahdanau Paper on Attention/Alignment
  2. 2 - Implementing a Simple Attention Mechanism
  3. 3 - Add Attention to the Decoder
  4. 4 - Inference / Forward Pass with Attention
  5. 5 - Training Loop
  6. 6 - Using/Evaluating the Trained Model
  7. 7 - Visualizing the Attention Matrix
  8. 8 - Comparing Our Attention Model to Bahdanau's
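
The lab walks through implementing a simple attention mechanism and wiring it into the decoder (steps 2-4 above). As a rough orientation before watching, a Bahdanau-style additive attention layer can be sketched in PyTorch as below. This is an illustrative sketch, not the video's exact code; the class name, weight names, and the `hidden_dim` assumption are ours.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention (illustrative sketch).

    Scores every encoder hidden state against the decoder's current
    hidden state, then returns their weighted sum (the context vector)
    plus the attention weights themselves, which can later be plotted
    as an attention matrix (step 7 of the lab).
    """
    def __init__(self, hidden_dim):
        super().__init__()
        self.W_enc = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.W_dec = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden: (batch, hidden); encoder_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_enc(encoder_outputs) + self.W_dec(decoder_hidden).unsqueeze(1)
        ))                                          # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)      # distribution over source positions
        context = (weights * encoder_outputs).sum(dim=1)  # (batch, hidden)
        return context, weights.squeeze(-1)

# Quick shape check
attn = AdditiveAttention(hidden_dim=16)
dec_h = torch.zeros(2, 16)           # decoder state for a batch of 2
enc_out = torch.randn(2, 7, 16)      # 7 encoder steps
context, weights = attn(dec_h, enc_out)
print(context.shape, weights.shape)
```

In the decoder, the returned `context` is typically concatenated with the current input embedding (or with the decoder output) before predicting the next token, and the per-step `weights` are stacked to visualize which source words each target word attended to.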
