Infinite Memory Transformer - Research Paper Explained
- 1 - Intro & Overview
- 2 - Sponsor Spot: Weights & Biases
- 3 - Problem Statement
- 4 - Continuous Attention Mechanism
- 5 - Unbounded Memory via concatenation & contraction
- 6 - Does this make sense?
- 7 - How the Long-Term Memory is used in an attention layer
- 8 - Entire Architecture Recap
- 9 - Sticky Memories by Importance Sampling
- 10 - Commentary: Pros and cons of using heuristics
- 11 - Experiments & Results
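Chapter 5's "unbounded memory via concatenation & contraction" can be illustrated with a minimal sketch. This is a hypothetical simplification, not the video's or paper's exact method (which fits a continuous signal with basis functions): here the contraction step is approximated by plain linear interpolation back to a fixed length, so the memory never grows even as new hidden states keep arriving. The function names `contract` and `update_memory` are illustrative inventions.

```python
# Hypothetical sketch of concatenate-then-contract long-term memory.
# Scalars stand in for hidden-state vectors to keep the example small.

def contract(memory, target_len):
    """Resample a list of scalars to target_len points via linear interpolation."""
    n = len(memory)
    if n == target_len:
        return list(memory)
    out = []
    for i in range(target_len):
        # fractional position in the original memory, from 0 to n-1
        pos = i * (n - 1) / (target_len - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append(memory[lo] * (1 - frac) + memory[hi] * frac)
    return out

def update_memory(memory, new_states, mem_size):
    """Concatenate new hidden states, then contract back to a fixed size."""
    return contract(memory + new_states, mem_size)

mem = [0.0] * 8                       # fixed-size long-term memory
for step in range(3):                 # each step produces 4 new states
    new = [float(step + 1)] * 4
    mem = update_memory(mem, new, 8)  # memory length stays 8
print(len(mem))                       # → 8
```

The point of the contraction is that memory cost stays constant regardless of how much history has been seen; the price is lossy compression of older states, which is what the later "sticky memories" chapter addresses by keeping important regions at higher resolution.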