

Ray - A Framework for Scaling and Distributing Python and ML Applications

Databricks via YouTube

Overview

Explore Ray, an open-source distributed framework for scaling Python and machine learning applications, in this comprehensive talk by Jules Damji, Lead Developer Advocate at Anyscale. Dive into Ray's architecture, core concepts, and primitives like remote Tasks and Actors. Learn about Ray's native libraries including Ray Tune, Ray Train, Ray Serve, Ray Datasets, and RLlib. Watch a practical demonstration using XGBoost for classification to understand how to scale training, hyperparameter tuning, and inference from a single node to a cluster. Gain insights into distributed computing trends, Ray's ecosystem, and design patterns. Discover how to tackle challenges in hyperparameter tuning using techniques like Bayesian optimization and early stopping. Get hands-on with code examples and learn about worker processes in Ray. Understand the performance benefits of using Ray for various machine learning workloads and how it can be applied in production environments.

Syllabus

Introduction
Agenda
Industry Trends
Distributed Computing
Distributed Applications
Ray Ecosystem
Ray Internals
Ray Design Patterns
The Ray Ecosystem
Ray Tune
Ray Tune Search Algorithms
Hyperparameter Tuning
Hyperparameter Tuning Challenges
Exhaustive Search
Bayesian Optimization
Early Stopping
Sample Code
Worker Processes
XGBoost-Ray
Demo
Training
XGBoost-Ray
Hyperparameter Tuning
Example
Summary
Reinforcement Learning
Ray Community
Contact Jules
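The tuning strategies contrasted in the syllabus (exhaustive search versus early stopping) can be sketched in plain Python, independent of Ray Tune's API (a toy illustration with a fake training curve; all names and numbers are hypothetical):

```python
# Toy comparison of exhaustive search vs. early stopping for
# hyperparameter tuning. Not Ray Tune code -- illustration only.

def loss_after_epochs(lr, epochs):
    # Fake training curve: loss decays toward a floor set by the
    # learning rate; the best lr here is 0.1 by construction.
    floor = (lr - 0.1) ** 2
    return floor + 1.0 / (1 + epochs)

def exhaustive_search(lrs, epochs=50):
    # Train every configuration to completion, keep the best.
    return min(lrs, key=lambda lr: loss_after_epochs(lr, epochs))

def early_stopping_search(lrs, budget=50, check_at=5):
    # Train all configs briefly, drop the worse half, and only
    # finish the survivors -- a simplified successive-halving
    # scheme of the kind Ray Tune's schedulers automate.
    ranked = sorted(lrs, key=lambda lr: loss_after_epochs(lr, check_at))
    survivors = ranked[: max(1, len(ranked) // 2)]
    return min(survivors, key=lambda lr: loss_after_epochs(lr, budget))

lrs = [0.01, 0.05, 0.1, 0.5, 1.0]
best_exhaustive = exhaustive_search(lrs)
best_early = early_stopping_search(lrs)
print(best_exhaustive, best_early)
```

Both strategies find the same configuration on this toy curve, but the early-stopping variant spends full training budget on only half the candidates, which is the efficiency argument made in the talk.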

Taught by

Databricks

