
How to Train a Large Language Model

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Dive into a comprehensive conference talk on training large language models, presented by Sam Smith of Google DeepMind at IPAM's Theory and Practice of Deep Learning Workshop. Explore key practical concepts behind LLM training, including a brief introduction to Transformers and why MLP layers dominate the computation. Gain insights into computational bottlenecks on TPUs and GPUs, techniques for training models too large to fit in a single device's memory, scaling laws, and hyperparameter tuning. Delve into a detailed discussion of LLM inference and, time permitting, discover the design of recurrent models that are competitive with Transformers, along with their advantages and drawbacks. Drawing on experience with Griffin and RecurrentGemma, this 53-minute presentation offers valuable knowledge for those interested in the intricacies of LLM development and scaling.

Syllabus

Sam Smith - How to train an LLM - IPAM at UCLA

Taught by

Institute for Pure & Applied Mathematics (IPAM)
