
YouTube

Mamba AI: Understanding Selective State Space Models as Transformer Alternatives

Discover AI via YouTube

Overview

Explore a 46-minute video lecture on Mamba (S6), a neural network architecture that uses selective state space models (SSMs) for sequence modeling. Learn how this approach offers a potential alternative to the Transformer, with improved efficiency on long sequences thanks to its linear-time recurrent formulation. Trace the evolution from the earlier S4 model to Mamba's input-dependent SSM parameters, which let the model selectively focus on or ignore information within a sequence. Consider Mamba's potential impact on the current AI landscape, where the Transformer architecture underlies most modern systems.
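To make the "selective" idea concrete, here is a minimal toy sketch of an input-dependent SSM recurrence in NumPy. This is not the lecture's or the Mamba paper's implementation (which uses a hardware-aware parallel scan and per-channel SSMs); the parameter names (`W_B`, `W_C`, `w_dt`) and the scalar-input simplification are assumptions for illustration. The key point it shows: the step size Δ and the matrices B and C are computed from the input at each timestep, unlike classical S4, where they are fixed.

```python
import numpy as np

def selective_ssm(x, A, w_dt, W_B, W_C):
    """Toy selective SSM scan over a scalar sequence.

    x    : (T,) input sequence
    A    : (n,) diagonal continuous-time state matrix (negative entries)
    w_dt : scalar weight producing the input-dependent step size
    W_B, W_C : (n,) weights producing input-dependent B_t and C_t
    """
    n = A.shape[0]
    h = np.zeros(n)                # hidden state
    out = np.empty(len(x))
    for t, xt in enumerate(x):
        # Selectivity: step size and B/C depend on the current input.
        dt = np.log1p(np.exp(w_dt * xt))   # softplus keeps the step positive
        B = W_B * xt
        C = W_C * xt
        # Zero-order-hold discretization of the diagonal system.
        A_bar = np.exp(dt * A)
        B_bar = (A_bar - 1.0) / A * B
        # Linear recurrence: h_t = A_bar * h_{t-1} + B_bar * x_t
        h = A_bar * h + B_bar * xt
        out[t] = C @ h
    return out

# Example usage with random parameters
rng = np.random.default_rng(0)
T, n = 12, 4
A = -np.abs(rng.standard_normal(n)) - 0.1   # stable (negative) diagonal
y = selective_ssm(rng.standard_normal(T), A, 0.5,
                  rng.standard_normal(n), rng.standard_normal(n))
print(y.shape)  # (12,)
```

Because `A` is diagonal and negative, `A_bar = exp(dt * A)` lies in (0, 1), so the state decays stably; a large input-dependent `dt` lets the model "reset" and attend to new content, while a small one preserves memory — the selection mechanism the lecture contrasts with attention.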

Syllabus

MAMBA AI (S6): Better than Transformers?

Taught by

Discover AI
