Prioritized Training Using Reducible Holdout Loss Selection (RHO-LOSS) - Accelerating Machine Learning

USC Information Sciences Institute via YouTube

Overview

Discover a technique for accelerating machine learning training in this hour-long presentation by Sören Mindermann and Jan Brauner from the University of Oxford. Learn about Reducible Holdout Loss Selection (RHO-LOSS), an online batch selection method that cuts training time by prioritizing the most impactful data points. Explore how RHO-LOSS outperforms existing data selection methods by focusing on points that are learnable, relevant to the task, and not yet learned. Gain insights into the technique's effectiveness across datasets, hyperparameters, and architectures, including MLPs, CNNs, and BERT. Understand the results on the Clothing-1M dataset, where RHO-LOSS reached 2% higher accuracy in 18 times fewer training steps. Delve into topics such as online batch selection, loss-based selection, irreducible loss models, and GPU usage. This presentation is valuable for machine learning practitioners seeking to improve training efficiency and model performance.
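The selection rule the talk describes can be sketched in a few lines. The PyTorch snippet below is a minimal illustration under stated assumptions, not the authors' reference implementation; names such as `select_batch`, `training_step`, and `irreducible_loss_model` are hypothetical. Each step scores a large candidate batch by its reducible holdout loss (the current model's training loss minus the "irreducible" loss from a model trained on a holdout set) and trains only on the top-scoring points.

```python
import torch
import torch.nn.functional as F


def select_batch(model, irreducible_loss_model, xs, ys, k):
    """Pick the k points with the highest reducible holdout loss.

    reducible loss = current training loss - irreducible holdout loss,
    where the irreducible loss comes from a model trained on a holdout set.
    """
    with torch.no_grad():
        train_loss = F.cross_entropy(model(xs), ys, reduction="none")
        irreducible_loss = F.cross_entropy(
            irreducible_loss_model(xs), ys, reduction="none"
        )
        rho_loss = train_loss - irreducible_loss
        top_idx = torch.topk(rho_loss, k).indices
    return xs[top_idx], ys[top_idx]


def training_step(model, irreducible_loss_model, optimizer,
                  candidate_xs, candidate_ys, k=32):
    """Score a large candidate batch, then train only on the selected subset
    (points that are learnable, relevant, and not yet learned)."""
    xs, ys = select_batch(model, irreducible_loss_model, candidate_xs, candidate_ys, k)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(xs), ys)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Note that the irreducible losses can be precomputed once per data point, so in practice selection adds only an extra forward pass over the candidate batch (a point the talk touches on under GPU usage and backward passes); the sketch above recomputes them each step for simplicity.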

Syllabus

Introduction
Title
Motivation
Examples
Headline result
Online batch selection
Loss selection
Standard objective
Reducible holdout loss
Training
Irreducible loss models
GPU usage
Backward passes
Vision
CoLA
Web Scribe
Question

Taught by

USC Information Sciences Institute

