

10 Decision Trees are Better Than 1 - Random Forest and AdaBoost

via YouTube

Overview

Explore the power of combining multiple decision trees into tree ensembles in this informative video. Delve into the two main types of tree ensembles: bagging (Random Forest) and boosting (AdaBoost, Gradient Boosting, XGBoost). Discover the three key benefits of using tree ensembles in machine learning. Follow along with a practical example of breast cancer prediction using ensemble methods. Access additional resources, including a blog post and example code, to further enhance your understanding of decision tree ensembles. Part of a comprehensive series on decision trees, this 17-minute tutorial provides valuable insights for both beginners and experienced data scientists looking to improve their predictive modeling skills.
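As a rough companion to the breast cancer prediction example mentioned above, here is a minimal sketch (not the instructor's actual code) that compares a single decision tree with a Random Forest (bagging) and AdaBoost (boosting) on scikit-learn's built-in breast cancer dataset. The model choices and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch: bagging (Random Forest) vs. boosting (AdaBoost)
# on scikit-learn's built-in breast cancer dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.metrics import accuracy_score

# Features and labels (malignant vs. benign tumors)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "Single decision tree": DecisionTreeClassifier(random_state=42),
    "Random Forest (bagging)": RandomForestClassifier(n_estimators=100, random_state=42),
    "AdaBoost (boosting)": AdaBoostClassifier(n_estimators=100, random_state=42),
}

# Fit each model and report held-out accuracy
for name, model in models.items():
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    print(f"{name}: test accuracy = {accuracy_score(y_test, preds):.3f}")
```

The design difference the video draws out: a Random Forest reduces variance by averaging many deep trees trained on bootstrap samples, while AdaBoost reduces bias by sequentially fitting shallow trees (decision stumps by default) that focus on the previous trees' mistakes.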

Syllabus

Intro
Tree Ensembles
2 Types of Tree Ensembles
1. Bagging (Random Forest)
2. Boosting (AdaBoost, Gradient Boosting, XGBoost)
3 Benefits of Tree Ensembles
Example Code: Breast Cancer Prediction

Taught by

Shaw Talebi

