Scalable Training of Language Models Using Ray, JAX, and TPUv4
Overview
Explore the challenges and design decisions involved in building a scalable training framework for large language models in this 34-minute conference talk from Ray Summit 2022. Delve into a quantitative analysis of the efficiency gains achieved by adopting new software and hardware, including Ray, JAX pjit, and TPUv4. Learn about the distributed training strategies that the size of modern large language models demands, and gain insight into the rapid developments on both the software and hardware fronts that address the challenges of efficient and robust training.
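To give a flavor of the kind of sharded training the talk refers to, below is a minimal illustrative sketch (not taken from the talk or from Cohere's framework) of using JAX's pjit to partition a single dense layer across a device mesh. The axis names "data" and "model", the array shapes, and the mesh layout are assumptions for illustration, and the keyword argument names (in_shardings / out_shardings) vary across JAX versions.

```python
# Illustrative sketch only: shard a matrix multiply across a device mesh with pjit.
# Axis names, shapes, and mesh layout are hypothetical, not Cohere's configuration.
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, PartitionSpec as P
from jax.experimental.pjit import pjit

# Arrange all available devices (e.g. TPU chips) into a 2D mesh:
# one axis for data parallelism, one for model (tensor) parallelism.
devices = np.array(jax.devices()).reshape(-1, 1)
mesh = Mesh(devices, axis_names=("data", "model"))

def layer(x, w):
    # A single dense layer; pjit inserts the collectives needed to
    # compute it over the sharded operands.
    return jnp.dot(x, w)

# Shard activations along the batch ("data") axis and weights along
# the feature ("model") axis; the output is sharded along both.
sharded_layer = pjit(
    layer,
    in_shardings=(P("data", None), P(None, "model")),
    out_shardings=P("data", "model"),
)

x = jnp.ones((8, 128))
w = jnp.ones((128, 256))

with mesh:
    y = sharded_layer(x, w)
    print(y.shape, y.sharding)
```

In a full training setup, the same partitioning annotations would be applied to the model's parameters, optimizer state, and training step, which is the style of data- and model-parallel scaling the talk discusses.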
Syllabus
Scalable training of language models using Ray, JAX, and TPUv4 at Cohere
Taught by
Anyscale