Overview
Explore Ray, an open-source distributed framework for scaling Python and machine learning applications, in this comprehensive talk by Jules Damji, Lead Developer Advocate at Anyscale. Dive into Ray's architecture, core concepts, and primitives like remote Tasks and Actors. Learn about Ray's native libraries including Ray Tune, Ray Train, Ray Serve, Ray Datasets, and RLlib. Watch a practical demonstration using XGBoost for classification to understand how to scale training, hyperparameter tuning, and inference from a single node to a cluster. Gain insights into distributed computing trends, Ray's ecosystem, and design patterns. Discover how to tackle challenges in hyperparameter tuning using techniques like Bayesian optimization and early stopping. Get hands-on with code examples and learn about worker processes in Ray. Understand the performance benefits of using Ray for various machine learning workloads and how it can be applied in production environments.
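To make those primitives concrete, here is a minimal sketch (not code from the talk) of Ray's remote Tasks and Actors; it assumes a local Ray installation (pip install ray):

import ray

ray.init()  # starts a local Ray runtime, or connects to an existing cluster

@ray.remote
def square(x):
    # A stateless remote Task: executed in a separate worker process.
    return x * x

@ray.remote
class Counter:
    # A stateful Actor: its methods run serially in a dedicated worker process.
    def __init__(self):
        self.total = 0

    def add(self, value):
        self.total += value
        return self.total

# Tasks: .remote() returns futures immediately; ray.get() blocks for results.
futures = [square.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]

# Actors: method calls are dispatched to the actor's own worker process.
counter = Counter.remote()
ray.get(counter.add.remote(10))
print(ray.get(counter.add.remote(5)))  # 15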
Syllabus
Introduction
Agenda
Industry Trends
Distributed Computing
Distributed Applications
Ray Ecosystem
Ray Internals
Ray Design Patterns
The Ray Ecosystem
Ray Tune
Ray Tune Search Algorithms
Hyperparameter Tuning
Hyperparameter Tuning Challenges
Exhaustive Search
Bayesian Optimization
Early Stopping
Sample Code (see the Ray Tune sketch after this syllabus)
Worker processes
XGBoost-Ray
Demo
Training
XGBoost-Ray
Hyperparameter Tuning
Example (see the XGBoost-Ray sketch after this syllabus)
Summary
Reinforcement Learning
Ray Community
Contact Jules
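For the hyperparameter tuning chapters above, here is a minimal Ray Tune sketch, not the talk's demo code, combining random search over a small search space with ASHA-based early stopping; it uses the tune.run API of the Ray 1.x era, and the objective function is a hypothetical stand-in:

from ray import tune
from ray.tune.schedulers import ASHAScheduler

def trainable(config):
    # Hypothetical objective: report a score each iteration so the
    # scheduler can stop unpromising trials early.
    score = 0.0
    for _ in range(10):
        score += config["lr"] * (1 - config["momentum"])
        tune.report(score=score)

analysis = tune.run(
    trainable,
    config={
        "lr": tune.loguniform(1e-4, 1e-1),  # sampled log-uniformly
        "momentum": tune.uniform(0.1, 0.9),
    },
    num_samples=20,             # number of trials to sample
    metric="score",
    mode="max",
    scheduler=ASHAScheduler(),  # early stopping of low-performing trials
)
print(analysis.best_config)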
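Likewise, a minimal sketch of distributed training with XGBoost-Ray, the library used in the demo; the toy dataset and actor count here are illustrative placeholders, not the talk's values:

from sklearn.datasets import load_breast_cancer
from xgboost_ray import RayDMatrix, RayParams, train

X, y = load_breast_cancer(return_X_y=True)  # toy binary classification data
train_set = RayDMatrix(X, y)

evals_result = {}
bst = train(
    {"objective": "binary:logistic", "eval_metric": ["logloss", "error"]},
    train_set,
    evals_result=evals_result,
    evals=[(train_set, "train")],
    num_boost_round=10,
    # Spread boosting across 4 Ray actors; the same code scales from a
    # single node to a cluster by changing num_actors.
    ray_params=RayParams(num_actors=4, cpus_per_actor=1),
)
print("train error:", evals_result["train"]["error"][-1])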
Taught by
Databricks