Pioneering a Hybrid SSM Transformer Architecture - Jamba Foundation Model

Databricks via YouTube

Overview

Explore a conference talk on the development of Jamba, a foundation model built on a hybrid Transformer-Mamba mixture-of-experts (MoE) architecture. Delve into the decision-making process behind this hybrid design, and gain insight into its layered composition of SSM (Mamba), Transformer, and MoE components. Learn how the flexible architecture enables resource- and objective-specific configurations, delivering high throughput on long contexts and a 256K-token context window, the largest in its size class, while fitting up to 140K tokens of context on a single 80GB GPU. Discover how Jamba points toward a new direction in large language model design, presented by AI21 Labs CTO Barak Lenz. Access additional resources on LLMs and MLOps, and connect with Databricks on social media for more on cutting-edge AI technologies.
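
The layered composition described above lends itself to a compact illustration. Below is a minimal, hypothetical Python sketch of how a Jamba-style hybrid block might schedule its layer types, assuming the roughly 1:7 attention-to-Mamba ratio and every-other-layer MoE placement reported in the Jamba release; the function and parameter names are illustrative, not AI21's actual code.

```python
# Structural sketch of one Jamba-style hybrid block (illustrative only).
# Each layer pairs a sequence mixer (Mamba SSM or attention) with an MLP
# (dense or mixture-of-experts), per the ratios described in the talk.

def jamba_block_schedule(layers_per_block=8, attn_every=8, moe_every=2):
    """Return (mixer, mlp) type labels for each layer in one hybrid block."""
    schedule = []
    for i in range(layers_per_block):
        # One attention layer per block; the rest are Mamba (SSM) layers.
        mixer = "attention" if (i + 1) % attn_every == 0 else "mamba"
        # Every second layer swaps the dense MLP for an MoE layer.
        mlp = "moe" if (i + 1) % moe_every == 0 else "dense"
        schedule.append((mixer, mlp))
    return schedule

if __name__ == "__main__":
    for idx, (mixer, mlp) in enumerate(jamba_block_schedule()):
        print(f"layer {idx}: {mixer:9s} + {mlp} MLP")
```

Varying `layers_per_block`, `attn_every`, and `moe_every` captures the talk's point about resource- and objective-specific configurations: the same block template can be tuned toward memory footprint, throughput, or quality.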

Syllabus

Pioneering a Hybrid SSM Transformer Architecture

Taught by

Databricks
