
Unlocking Mixture of Experts - From One Know-it-all to a Group of Jedi Masters

EuroPython Conference via YouTube

Overview

Explore the Mixture of Experts (MoE) technique in this 31-minute conference talk from EuroPython 2024. Learn how MoE offers a practical, intuitive next step for elevating the predictive power of generalized know-it-all models, particularly in critical domains like healthcare. Discover the Divide and Conquer principle behind MoE, along with its strengths and limitations. Progress through intuitive reasoning and solid mathematical underpinnings, enriched with concrete examples. Survey the landscape from ensemble models to stacked estimators, gradually ascending to MoE. Examine challenges and alternative routes, and learn when to apply MoE effectively. Conclude with a business-oriented discussion of metrics around cost, latency, and throughput for MoE models, plus resources for working with pre-trained MoE models, fine-tuning them, or building your own from scratch.
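The core mechanism behind the talk's Divide and Conquer theme is easy to sketch: a gating network assigns each input a weight per expert, and the model's output is the gate-weighted combination of the expert predictions. Below is a minimal, illustrative Python sketch of that idea (not code from the talk); the expert functions and the untrained gate weights are placeholder assumptions.

import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class SoftMixtureOfExperts:
    """Toy soft-gated MoE: a linear gating network mixes expert predictions."""

    def __init__(self, experts, n_features, seed=0):
        self.experts = experts  # list of callables mapping X -> 1-D predictions
        rng = np.random.default_rng(seed)
        # Gate parameters: one weight vector per expert (illustrative, untrained).
        self.gate_weights = rng.normal(size=(n_features, len(experts)))

    def predict(self, X):
        gate = softmax(X @ self.gate_weights)                   # (n_samples, n_experts)
        preds = np.stack([e(X) for e in self.experts], axis=1)  # (n_samples, n_experts)
        return (gate * preds).sum(axis=1)                       # gate-weighted mixture

# Hypothetical experts, each meant to specialize in part of the input space.
experts = [
    lambda X: 2.0 * X[:, 0],   # e.g. a linear specialist
    lambda X: X[:, 0] ** 2,    # e.g. a quadratic specialist
]

X = np.random.default_rng(1).normal(size=(5, 3))
moe = SoftMixtureOfExperts(experts, n_features=3)
print(moe.predict(X))  # one gate-weighted prediction per sample

In a trained MoE, the gate weights are learned jointly with the experts so that each expert specializes on the inputs the gate routes to it; the random gate here only illustrates the mixing mechanics.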

Syllabus

Unlocking Mixture of Experts: From One Know-it-all to a Group of Jedi Masters — Pranjal Biyani

Taught by

EuroPython Conference
