Mamba: Linear-Time Sequence Modeling with Selective State Spaces
Classroom Contents
- 1 - Introduction
- 2 - Transformers vs RNNs vs S4
- 3 - What are state space models?
- 4 - Selective State Space Models
- 5 - The Mamba architecture
- 6 - The SSM layer and forward propagation (a sketch of the recurrence follows this list)
- 7 - Utilizing GPU memory hierarchy
- 8 - Efficient computation via prefix sums / parallel scans (a scan sketch follows this list)
- 9 - Experimental results and comments
- 10 - A brief look at the code
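For readers skimming the outline, chapters 3-6 center on the discretized state-space recurrence h_t = A_bar[t] * h_{t-1} + B_bar[t] * x[t], y_t = C[t] . h_t, where the "selective" part means A_bar, B_bar, and C are computed from the input at each step. Below is a minimal NumPy sketch of the sequential forward pass; the shapes, the diagonal-A_bar simplification, and the name `ssm_forward` are my assumptions for illustration, not the course's or the paper's reference code.

```python
import numpy as np

def ssm_forward(A_bar, B_bar, C, x):
    """Sequential forward pass of a (selective) discretized SSM.

    Illustrative sketch only. Per-timestep parameters model the
    "selective" aspect: in Mamba they are functions of the input.

    A_bar: (L, N) diagonal state transition per step
    B_bar: (L, N) input projection per step
    C:     (L, N) output projection per step
    x:     (L,)   input sequence
    Returns y: (L,) outputs of
        h_t = A_bar[t] * h_{t-1} + B_bar[t] * x[t]
        y_t = C[t] . h_t
    """
    L, N = A_bar.shape
    h = np.zeros(N)
    y = np.empty(L)
    for t in range(L):
        h = A_bar[t] * h + B_bar[t] * x[t]  # elementwise state update
        y[t] = C[t] @ h                     # readout
    return y
```

This O(L * N) loop is exactly the computation that chapter 8's parallel scan reorganizes.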
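Chapter 8's "prefix sums / parallel scans" rests on the fact that steps of the recurrence above compose associatively: (a1, b1) o (a2, b2) = (a1 * a2, a2 * b1 + b2), so all hidden states can be computed in O(log L) combining rounds rather than a length-L loop. The following is a hedged sketch of that idea (a vectorized Hillis-Steele-style scan on the scalar case; the function name is mine, and this is not Mamba's hardware-aware CUDA kernel):

```python
import numpy as np

def scan_recurrence(a, b):
    """Compute h_t = a[t] * h_{t-1} + b[t] (with h_{-1} = 0) via an
    inclusive scan over the associative operator
        (a1, b1) o (a2, b2) = (a1 * a2, a2 * b1 + b2).
    Runs in O(log L) vectorized rounds (Hillis-Steele scheme); a sketch
    of the idea named in chapter 8, not Mamba's fused kernel.
    """
    a, b = a.astype(float), b.astype(float)
    shift = 1
    while shift < len(a):
        # fold in the aggregate from `shift` positions back; NumPy
        # evaluates each right-hand side fully before assigning
        b[shift:] = a[shift:] * b[:-shift] + b[shift:]
        a[shift:] = a[shift:] * a[:-shift]
        shift *= 2
    return b  # b[t] now holds h_t

# Sanity check against the plain sequential loop:
rng = np.random.default_rng(0)
a, b = rng.uniform(0.5, 1.0, 64), rng.normal(size=64)
h, hs = 0.0, []
for t in range(64):
    h = a[t] * h + b[t]
    hs.append(h)
assert np.allclose(scan_recurrence(a, b), hs)
```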