Overview
Explore Microsoft's 17-billion-parameter Turing-NLG language model and the ZeRO optimizer in this video. Dive into how ZeRO partitions optimizer states, gradients, and parameters across data-parallel workers, enabling models far too large for a single GPU's memory to be trained without sacrificing training speed. Learn about Turing-NLG's state-of-the-art perplexity results and the DeepSpeed framework that powers it, and gain insight into the latest techniques for large-scale language model training and optimization that are pushing the boundaries of natural language processing.
Syllabus
Turing-NLG, DeepSpeed and the ZeRO optimizer
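As a rough illustration of the core idea behind ZeRO (stage 1), here is a minimal Python sketch of optimizer-state partitioning. This is not DeepSpeed's actual API; the function name `partition_params` and the setup are hypothetical, assuming each data-parallel worker keeps optimizer state (e.g. Adam's momentum and variance) for only its own shard of the parameters instead of a full replica.

```python
# Hypothetical sketch of ZeRO stage-1 partitioning: instead of every
# data-parallel worker holding optimizer state for ALL parameters,
# each worker owns the state for only its ~1/N shard.
# partition_params is an illustrative name, not part of DeepSpeed.

def partition_params(num_params, num_workers):
    """Split parameter indices into near-equal contiguous shards."""
    base, extra = divmod(num_params, num_workers)
    shards, start = [], 0
    for rank in range(num_workers):
        size = base + (1 if rank < extra else 0)
        shards.append(range(start, start + size))
        start += size
    return shards

# With 10 parameters and 4 workers, each worker stores optimizer
# state for only its own shard rather than all 10 parameters:
shards = partition_params(10, 4)
print([len(s) for s in shards])  # -> [3, 3, 2, 2]
```

Per-worker optimizer memory shrinks roughly by the number of workers, at the cost of an extra communication step to gather updated parameters, which is why the video's claim of "parallelism without sacrificing training speed" hinges on overlapping that communication with computation.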
Taught by
Yannic Kilcher