
YouTube

XGBoost Part 2 - Classification

StatQuest with Josh Starmer via YouTube

Overview

Dive into the second part of a four-part video series on XGBoost, focusing on classification techniques. Learn how XGBoost trees are constructed for classification problems, building upon the regression concepts covered in part one. Explore key topics such as initial predictions, similarity scores, tree building, gain calculation, cover for classification, pruning, and the application of logistic regression. Gain a deeper understanding of how XGBoost adapts its algorithms for classification tasks, assuming prior knowledge of XGBoost trees for regression, gradient boost for classification, odds and log-odds, and the logistic function.
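To make terms like similarity score, gain, and cover concrete, below is a minimal sketch (not taken from the video) of the standard formulas XGBoost uses when building a classification tree. The toy labels, the candidate split, the lambda value, and the helper function names are illustrative assumptions.

```python
import numpy as np

def similarity_score(residuals, prev_probs, lam=0.0):
    # (sum of residuals)^2 / (sum of p * (1 - p) + lambda)
    return np.sum(residuals) ** 2 / (np.sum(prev_probs * (1 - prev_probs)) + lam)

def cover(prev_probs):
    # Cover for classification: sum of p * (1 - p) over the points in the node
    return np.sum(prev_probs * (1 - prev_probs))

y = np.array([0.0, 0.0, 1.0, 1.0])   # toy binary labels (assumed example data)
p = np.full(4, 0.5)                  # initial prediction: probability 0.5 for everyone
residuals = y - p                    # residuals used to grow the first tree

root = similarity_score(residuals, p)
left = similarity_score(residuals[:1], p[:1])   # candidate split: first point vs. the rest
right = similarity_score(residuals[1:], p[1:])
gain = left + right - root           # larger gain means a better split

print(f"root={root:.2f} left={left:.2f} right={right:.2f} "
      f"gain={gain:.2f} cover={cover(p):.2f}")
```

In the series, gains like this are compared against a pruning parameter (gamma) to decide whether a branch is kept, which is where the Pruning topic in the syllabus fits in.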

Syllabus

Intro
Overview
Initial Prediction
Similarity Scores
Building a Tree
Similarity Score
Gain
Cover
Cover for Classification
Pruning
Classification
Logistic Regression
Summary

Taught by

StatQuest with Josh Starmer
