Ensemble Methods from Scratch

via CodeSignal

Overview

Learn about ensemble methods and how to implement them from scratch. This course covers the understanding and implementation of multiple ensemble methods, such as Bagging, Random Forest, AdaBoost, Stacking, and gradient boosting machines like XGBoost, without relying on high-level libraries.

Syllabus

  • Lesson 1: Implementing Bagging with Decision Trees in Python (see the bagging sketch after this syllabus)
    • Ensemble Predictions with Bagging and Decision Trees
    • Navigating the Data Cosmos with Bootstrapping and Prediction Functions
    • Implementing Bootstrapping and Prediction in Ensemble Learning
    • Predicting with Bagging and Decision Trees
    • Observing Bagging with Decision Trees in Action
  • Lesson 2: Deep Dive into Random Forest: From Concepts to Real-World Application (see the random forest sketch after this syllabus)
    • Evaluating Random Forest Accuracy on Iris Dataset
    • Adjusting the Depth of Our RandomForest
    • Seeding the Forest: Random State Initialization
  • Lesson 3: Demystifying AdaBoost: A Practical Guide to Strengthening Predictive Models (see the AdaBoost sketch after this syllabus)
    • AdaBoost Accuracy Demonstration
    • Tweaking the AdaBoost Learning Rate
    • Boosting the Weights in AdaBoost
    • AdaBoost Prediction Challenge
  • Lesson 4: Enhancing Machine Learning Predictions with Stacking Ensemble Techniques (see the stacking sketch after this syllabus)
    • Launching the Stacking Model into Orbit
    • Switching the Meta-Model in Stacking Ensemble
    • Stacking Ensemble: Combining Base Model Predictions
    • Assemble the Stacking Ensemble: Meta-Model Predictions
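
Lesson 1 builds bagging out of two pieces: a bootstrap-sampling helper and a majority-vote prediction function. The sketch below is a minimal illustration of that idea, assuming scikit-learn's DecisionTreeClassifier as the base learner and the Iris dataset; the helper names and parameter choices are illustrative, not the course's own code.

    # Minimal bagging sketch: bootstrap sampling plus majority voting.
    # Base learner, dataset, and tree count are assumptions for illustration.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    def bootstrap_sample(X, y, rng):
        # Draw len(X) rows with replacement to build one bootstrap sample.
        idx = rng.integers(0, len(X), size=len(X))
        return X[idx], y[idx]

    def bagging_predict(models, X):
        # Majority vote across the per-model predictions for each sample.
        preds = np.array([m.predict(X) for m in models])
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
    rng = np.random.default_rng(42)

    models = []
    for _ in range(25):  # 25 bootstrapped trees (arbitrary choice)
        Xb, yb = bootstrap_sample(X_train, y_train, rng)
        models.append(DecisionTreeClassifier().fit(Xb, yb))

    accuracy = (bagging_predict(models, X_test) == y_test).mean()
    print(f"Bagging accuracy: {accuracy:.2f}")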
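
Lesson 2's random forest adds one twist to bagging: each tree also sees a random subset of the features, and the tree depth and random seed become the knobs the lesson experiments with. A minimal sketch under the same assumptions (scikit-learn trees, Iris data, hypothetical helper names):

    # Minimal random-forest sketch: bagging plus a random feature subset per tree.
    # max_depth and seed mirror the lesson's "depth" and "random state" exercises.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    def fit_forest(X, y, n_trees=25, max_depth=3, seed=42):
        rng = np.random.default_rng(seed)           # "seeding the forest"
        n_feats = max(1, int(np.sqrt(X.shape[1])))  # sqrt(d) features per tree
        forest = []
        for _ in range(n_trees):
            rows = rng.integers(0, len(X), size=len(X))             # bootstrap rows
            cols = rng.choice(X.shape[1], n_feats, replace=False)   # random features
            tree = DecisionTreeClassifier(max_depth=max_depth, random_state=seed)
            forest.append((tree.fit(X[rows][:, cols], y[rows]), cols))
        return forest

    def forest_predict(forest, X):
        preds = np.array([tree.predict(X[:, cols]) for tree, cols in forest])
        return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)
    forest = fit_forest(X_tr, y_tr, max_depth=3)
    print("Random forest accuracy:", (forest_predict(forest, X_te) == y_te).mean())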
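
Lesson 3's AdaBoost re-weights the training samples each round, boosting the weights of misclassified points and scaling each weak learner's vote by a learning rate. A rough sketch with decision stumps, using a synthetic binary dataset and parameter values chosen only for illustration:

    # Minimal AdaBoost sketch: misclassified samples get their weights boosted
    # each round, scaled by a learning_rate. Labels are mapped to {-1, +1}.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=200, random_state=42)
    y = np.where(y == 0, -1, 1)

    def fit_adaboost(X, y, n_rounds=20, learning_rate=0.5):
        w = np.full(len(X), 1 / len(X))     # start with uniform sample weights
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
            alpha = learning_rate * 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * y * pred)  # boost the weights of mistakes
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, alphas

    def adaboost_predict(stumps, alphas, X):
        scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
        return np.sign(scores)

    stumps, alphas = fit_adaboost(X, y)
    print("AdaBoost train accuracy:", (adaboost_predict(stumps, alphas, X) == y).mean())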
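
Lesson 4's stacking ensemble feeds the base models' predictions into a meta-model. The sketch below assumes a decision tree and a k-nearest-neighbors classifier as base models and logistic regression as the meta-model; these choices, and the use of out-of-fold predictions, are illustrative rather than the course's exact setup.

    # Minimal stacking sketch: out-of-fold base-model predictions become the
    # features of a meta-model. Model choices are assumptions for illustration.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split, cross_val_predict
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

    base_models = [DecisionTreeClassifier(random_state=42), KNeighborsClassifier()]

    # Out-of-fold predictions keep the base models from leaking their training
    # labels into the meta-model's training features.
    meta_train = np.column_stack(
        [cross_val_predict(m, X_tr, y_tr, cv=5) for m in base_models])
    meta_test = np.column_stack(
        [m.fit(X_tr, y_tr).predict(X_te) for m in base_models])

    meta_model = LogisticRegression(max_iter=1000).fit(meta_train, y_tr)
    print("Stacking accuracy:", (meta_model.predict(meta_test) == y_te).mean())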
