Overview

Explore the groundbreaking advances in training large-scale machine learning models with Microsoft's ZeRO-Infinity technology in this 27-minute video. Learn how ZeRO-Infinity overcomes previous GPU memory limitations by offloading model state beyond GPU memory, enabling the training of models with trillions of parameters on modest GPU resources and the fine-tuning of billion-parameter models on a single GPU. Discover what this means for working with large models such as GPT-2, and walk through the technical details of ZeRO-Infinity, including its forward step and parallelization techniques. Finally, review the results and potential applications of a technique that promises to reshape deep learning training by unlocking unprecedented model scale.

Syllabus
Intro
Motivation
Paper
Forward Step
Parallelization
Results
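For viewers who want to experiment with the ideas covered in the video, ZeRO-Infinity's offloading is exposed through DeepSpeed's configuration dictionary. The sketch below shows the general shape of such a config; the batch size, precision settings, and NVMe path are illustrative assumptions, not values taken from the video.

```python
# A minimal sketch of a DeepSpeed-style configuration enabling ZeRO stage 3
# with NVMe offload, the mechanism ZeRO-Infinity uses to move parameters and
# optimizer state out of GPU memory. All concrete values here are assumptions
# for illustration.
ds_config = {
    "train_batch_size": 16,              # assumed; tune for your hardware
    "fp16": {"enabled": True},           # mixed precision to cut memory use
    "zero_optimization": {
        "stage": 3,                      # partition params, grads, and optimizer state
        "offload_param": {
            "device": "nvme",            # spill parameters to NVMe SSD
            "nvme_path": "/local_nvme",  # assumed mount point
        },
        "offload_optimizer": {
            "device": "nvme",            # spill optimizer state to NVMe SSD
            "nvme_path": "/local_nvme",
        },
    },
}

# In practice this dict would be passed to deepspeed.initialize(...) along
# with the model; here we only build the config to show its structure.
print(ds_config["zero_optimization"]["stage"])  # → 3
```

With `device` set to `"cpu"` instead of `"nvme"`, the same structure offloads to host memory, trading disk bandwidth for RAM capacity.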
Taught by
Edan Meyer