Training Large Language Models - GPT-NeoX-20B, BigScience BLOOM, OPT-175B Explained

Aleksa Gordić - The AI Epiphany via YouTube

Overview

Explore three groundbreaking large language model projects in this comprehensive video lecture. Delve into BigScience's 176-billion-parameter BLOOM model, Meta's 175-billion-parameter OPT model, and EleutherAI's 20-billion-parameter GPT-NeoX-20B. Gain insights into the challenges of training these massive language models, including cluster deletions and dataset anomalies. Examine each project's paper, code, and shared weights to deepen your understanding of large language models. Learn about the training processes, technical specifications, and unique features of each model through detailed explanations and chronicles from the researchers involved.

Syllabus

Intro
Sponsor: Weights & Biases
BLOOM paper
BLOOM chronicles
OPT paper
OPT chronicles
GPT-NeoX-20B paper
Outro

Taught by

Aleksa Gordić - The AI Epiphany

