Overview
Explore advanced optimizations for XGBoost in this fourth and final video of the series. Dive into techniques for handling large training datasets, including the Approximate Greedy Algorithm, Parallel Learning, Weighted Quantile Sketch, Sparsity-Aware Split Finding, Cache-Aware Access, and Blocks for Out-of-Core Computation. Learn step-by-step how XGBoost efficiently manages missing data, utilizes default paths, and optimizes performance for massive datasets. Gain insights into the practical applications of these advanced concepts in machine learning and data analysis.
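To make the first two ideas concrete, here is a minimal sketch (not XGBoost's actual implementation) of approximate greedy split finding: candidate thresholds are chosen from weighted quantiles so each bin carries roughly equal total weight, and only those candidates are scored, using the squared-sum-of-residuals similarity score from the earlier videos. The function names and the `lam` regularization parameter are illustrative choices, not XGBoost API.

```python
# Minimal sketch of the Approximate Greedy Algorithm with a simplified
# Weighted Quantile Sketch. Exact greedy would score a split at every
# distinct feature value; here we score only quantile candidates.

def weighted_quantile_candidates(values, weights, n_bins):
    """Return up to n_bins thresholds so each bin holds roughly
    equal total weight (a simplified weighted quantile)."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    step = total / n_bins
    candidates, acc, target = [], 0.0, step
    for v, w in pairs:
        acc += w
        if acc >= target:
            candidates.append(v)
            target += step
    return candidates

def best_split(values, residuals, candidates, lam=1.0):
    """Pick the candidate threshold with the highest gain, using the
    similarity score (sum of residuals)^2 / (count + lambda)."""
    def similarity(res):
        return (sum(res) ** 2) / (len(res) + lam) if res else 0.0
    root = similarity(residuals)
    best_t, best_gain = None, 0.0
    for t in candidates:
        left = [r for v, r in zip(values, residuals) if v < t]
        right = [r for v, r in zip(values, residuals) if v >= t]
        gain = similarity(left) + similarity(right) - root
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain

# Example: two well-separated clusters; the best candidate split
# lands between them.
vals = [1, 2, 3, 10, 11, 12]
res = [-1, -1, -1, 1, 1, 1]
cands = weighted_quantile_candidates(vals, [1] * len(vals), 3)
threshold, gain = best_split(vals, res, cands)
```

With uniform weights this reduces to plain quantiles; in XGBoost the weights come from the second derivatives (Hessians) of the loss, which is what makes the quantiles "weighted."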
Syllabus
Intro
Overview
Greedy Algorithm Limitations
Approximate Greedy Algorithm
Weighted Quantile Sketch
What is a Weighted Quantile
Weighted Quantiles in Classification
Sparsity-Aware Split Finding
Cache-Aware Access
Blocks for Out-of-Core Computation
Random Subsets
Summary
Taught by
StatQuest with Josh Starmer