Overview
Dive into the second part of a four-part video series on XGBoost, this time focusing on classification. Learn how XGBoost trees are constructed for classification problems, building on the regression concepts covered in part one. Key topics include the initial prediction, similarity scores, tree building, gain, cover for classification, pruning, and the connection to logistic regression. The video assumes prior familiarity with XGBoost trees for regression, gradient boosting for classification, odds and log-odds, and the logistic function.
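To make those topics concrete, here is a minimal Python sketch of the node-level calculations the video walks through for classification: the similarity score, cover, gain, and leaf output value, with the logistic function mapping log-odds back to probabilities. The function names, the toy data, the regularization parameter lam, and the 0.3 learning rate are illustrative assumptions, not the video's or the XGBoost library's own notation.

```python
import math

def similarity_score(residuals, prev_probs, lam=1.0):
    # For classification, the denominator is sum(p * (1 - p)) over the
    # previously predicted probabilities, plus the regularizer lambda.
    denom = sum(p * (1 - p) for p in prev_probs) + lam
    return sum(residuals) ** 2 / denom

def cover(prev_probs):
    # "Cover" for classification is the similarity-score denominator
    # without lambda: sum(p * (1 - p)).
    return sum(p * (1 - p) for p in prev_probs)

def gain(left_sim, right_sim, root_sim):
    # Gain of a split = left similarity + right similarity - root similarity.
    # During pruning, a branch is removed when gain - gamma is negative.
    return left_sim + right_sim - root_sim

def output_value(residuals, prev_probs, lam=1.0):
    # Leaf output, expressed on the log-odds scale.
    return sum(residuals) / (sum(p * (1 - p) for p in prev_probs) + lam)

def logistic(log_odds):
    # Convert log-odds back to a probability.
    return 1 / (1 + math.exp(-log_odds))

# Toy example: the initial prediction is a probability of 0.5
# for every observation (log-odds of 0).
probs = [0.5, 0.5, 0.5, 0.5]
labels = [1, 1, 0, 0]
residuals = [y - p for y, p in zip(labels, probs)]

# Evaluate a candidate split that puts the first two rows in the left leaf.
root_sim = similarity_score(residuals, probs)
left_sim = similarity_score(residuals[:2], probs[:2])
right_sim = similarity_score(residuals[2:], probs[2:])
print("gain:", gain(left_sim, right_sim, root_sim))
print("left cover:", cover(probs[:2]))

# New prediction for a left-leaf row after one tree,
# using an assumed learning rate of 0.3.
leaf_out = output_value(residuals[:2], probs[:2])
new_log_odds = 0.0 + 0.3 * leaf_out
print("new probability:", logistic(new_log_odds))
```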
Syllabus
Intro
Overview
Initial Prediction
Similarity Scores
Building a Tree
Similarity Score
Gain
Cover
Cover for Classification
Pruning
Classification
Logistic Regression
Summary
Taught by
StatQuest with Josh Starmer