Overview
Explore tree-based ensemble models in this EuroPython 2018 conference talk by Kevin Lemagnen. Dive into Random Forest and Gradient Boosting, two powerful machine learning techniques built on bagging and boosting, respectively. Learn how these ensemble models compare to Deep Learning and why they remain essential tools for data scientists. Discover their implementation in Python using popular libraries such as LightGBM, XGBoost, and scikit-learn. Gain insight into the theory behind these models and their practical application to a wide range of problems. Understand why ensemble models are often easier to tune and interpret than more complex alternatives. Follow along with the provided notebook to bridge the gap between theoretical concepts and hands-on implementation; a rough sketch of the two approaches is shown below.
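As a rough illustration of the bagging-versus-boosting distinction the talk covers, here is a minimal sketch using scikit-learn only. The synthetic dataset and hyperparameters are illustrative assumptions, not taken from the talk's notebook, and the talk additionally uses LightGBM and XGBoost, whose estimators expose a similar fit/predict interface.

```python
# Minimal sketch (not the talk's notebook): a bagging-based ensemble
# (Random Forest) next to a boosting-based one (Gradient Boosting),
# trained on a synthetic dataset as a stand-in for real data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic binary-classification data (illustrative only).
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Bagging: many deep trees fit on bootstrap samples, predictions averaged.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

# Boosting: shallow trees added sequentially, each one correcting the
# residual errors of the ensemble built so far.
gb = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                max_depth=3, random_state=0)
gb.fit(X_train, y_train)

print("Random Forest accuracy:    ", accuracy_score(y_test, rf.predict(X_test)))
print("Gradient Boosting accuracy:", accuracy_score(y_test, gb.predict(X_test)))
```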
Syllabus
Introduction
Data
Random Forest
Boosting
Recommendations
Taught by
EuroPython Conference