

OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models

Unify via YouTube

Overview

Explore a comprehensive presentation on OpenMoE, an early effort in open mixture-of-experts language models, delivered by Fuzhao Xue. Dive into the intricacies of this innovative approach to large language models, including the development of a series of open-source, decoder-only MoE LLMs ranging from 650M to 34B parameters. Learn about the cost-effectiveness of MoE models compared to dense LLMs, and gain insights into the routing mechanisms within these models. Discover key concepts such as Context-Independent Specialization and the challenges in routing decisions. Access additional resources, including the original research paper and related content from Unify, to deepen your understanding of this cutting-edge AI technology.
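
The routing idea discussed in the talk is easiest to see in code. Below is a minimal sketch of token-choice top-k routing, the general mechanism that MoE LLMs such as OpenMoE build on: a small router scores each token against every expert, the top-k experts process the token, and their outputs are mixed by the gate weights. The dimensions, expert count, and random weights here are toy assumptions for illustration, not values or code from the OpenMoE models.

# Minimal sketch of token-choice top-k MoE routing (illustrative only;
# not the OpenMoE codebase). All sizes and weights are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 8, 16      # assumed toy dimensions
num_experts, top_k = 4, 2  # assumed: 4 experts, each token routed to 2

# Router and expert weights (random stand-ins for learned parameters).
W_router = rng.standard_normal((d_model, num_experts))
experts_in = rng.standard_normal((num_experts, d_model, d_ff))
experts_out = rng.standard_normal((num_experts, d_ff, d_model))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def moe_layer(tokens):
    """tokens: (num_tokens, d_model) -> (num_tokens, d_model)."""
    logits = tokens @ W_router                         # (num_tokens, num_experts)
    probs = softmax(logits)
    top_idx = np.argsort(-probs, axis=-1)[:, :top_k]   # chosen experts per token
    out = np.zeros_like(tokens)
    for t, token in enumerate(tokens):
        gates = probs[t, top_idx[t]]
        gates = gates / gates.sum()                    # renormalize over top-k gates
        for gate, e in zip(gates, top_idx[t]):
            h = np.maximum(token @ experts_in[e], 0.0) # expert FFN with ReLU
            out[t] += gate * (h @ experts_out[e])
    return out

tokens = rng.standard_normal((5, d_model))
print(moe_layer(tokens).shape)  # (5, 8): output keeps the input shape

Because only top_k of the num_experts feed-forward blocks run per token, compute per token stays roughly constant as experts are added, which is the cost-effectiveness argument the talk compares against dense LLMs.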

Syllabus

OpenMoE Explained

Taught by

Unify

