Explore Microsoft's groundbreaking 17-billion-parameter language model, Turing-NLG, and the innovative ZeRO optimizer in this informative video. Dive into the technical details of how ZeRO combines the memory efficiency of model parallelism with the throughput of data parallelism, without sacrificing training speed. Learn about the Turing-NLG model's state-of-the-art perplexity results and the DeepSpeed framework that powers it. Gain insights into the latest advancements in large-scale language model training and optimization techniques that are pushing the boundaries of natural language processing.
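For a concrete sense of what training with ZeRO looks like in practice, below is a minimal sketch using the open-source DeepSpeed library. The stand-in model, batch size, learning rate, and ZeRO stage are placeholder assumptions for illustration only, not the actual Turing-NLG configuration.

```python
# Illustrative sketch: ZeRO-partitioned data parallelism via DeepSpeed.
# Model, batch size, and hyperparameters are placeholders, not the Turing-NLG setup.
import torch
import torch.nn as nn
import deepspeed

# Stand-in model; Turing-NLG itself is a 17-billion-parameter Transformer.
model = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.GELU(),
    nn.Linear(4096, 1024),
)

ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},       # mixed precision, common in large-scale training
    "zero_optimization": {
        "stage": 2,                  # partition optimizer states and gradients across ranks
        "overlap_comm": True,        # overlap gradient communication with the backward pass
    },
    "optimizer": {
        "type": "Adam",
        "params": {"lr": 1e-4},
    },
}

# deepspeed.initialize wraps the model in an engine that handles ZeRO partitioning,
# so the training loop looks the same as ordinary data-parallel training.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

def train_step(batch, labels):
    outputs = model_engine(batch)
    loss = nn.functional.mse_loss(outputs, labels)
    model_engine.backward(loss)   # engine scales the loss and reduces gradients per ZeRO stage
    model_engine.step()           # optimizer step on each rank's partition of the states
    return loss
```

The key design point ZeRO makes is that the user-facing loop stays pure data parallelism; the engine shards optimizer states, gradients, and (at higher stages) parameters behind the scenes to cut per-GPU memory.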