Linformer - Self-Attention with Linear Complexity

Yannic Kilcher via YouTube

Classroom Contents

  1. Intro & Overview
  2. The Complexity of Self-Attention
  3. Embedding Dimension & Multiple Heads
  4. Formal Attention
  5. Empirical Investigation into RoBERTa
  6. Theorem: Self-Attention is Low Rank
  7. Linear Self-Attention Method
  8. Theorem: Linear Self-Attention
  9. Language Modeling
  10. NLP Benchmarks
  11. Compute Time & Memory Gains
  12. Broader Impact Statement
  13. Conclusion
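The linear self-attention method covered in chapters 7 and 8 replaces the n×n attention matrix with an n×k one by projecting the keys and values down to a fixed length k before the softmax. A minimal single-head NumPy sketch (the function name and the random projection matrices E and F are illustrative, not from the video):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E, F):
    """Linformer-style attention: project keys and values from
    sequence length n down to k, so the score matrix is (n, k)
    instead of (n, n) -- linear rather than quadratic in n."""
    K_proj = E @ K                         # (k, d) projected keys
    V_proj = F @ V                         # (k, d) projected values
    d = Q.shape[-1]
    scores = Q @ K_proj.T / np.sqrt(d)     # (n, k) attention scores
    return softmax(scores, axis=-1) @ V_proj  # (n, d) output

# Toy dimensions: sequence length n, projected length k, head dim d.
n, k, d = 128, 16, 32
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)  # illustrative projections
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (128, 32)
```

The score matrix here costs O(nkd) to form instead of O(n²d), which is the source of the compute and memory gains discussed in chapter 11.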
