

Understanding and Overcoming the Statistical Limitations of Decision Trees

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore a comprehensive lecture on the statistical limitations of decision trees and innovative approaches to overcome them. Delve into the performance gap between decision trees and more complex machine learning methods like random forests and deep learning. Examine sharp squared error generalization lower bounds for decision trees fitted to sparse additive generative models, and discover how these bounds connect to rate-distortion theory. Learn about the proposed Fast Interpretable Greedy-Tree Sums (FIGS) algorithm, which extends CART to grow multiple trees simultaneously. Investigate FIGS' ability to disentangle additive model components, reduce redundant splits, and improve prediction performance. Review experimental results across various datasets, showcasing FIGS' superiority over other rule-based methods in scenarios with limited splits. Gain insights into the application of FIGS in high-stakes domains, particularly its effectiveness in developing clinical decision instruments that outperform traditional tree-based methods by over 20%.
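The overview above describes FIGS as extending CART to grow several trees simultaneously so that each tree can capture one component of a sparse additive generative model. The sketch below is not from the lecture; it is a minimal illustration of that setup, assuming the open-source imodels package and its FIGSRegressor class with a max_rules split budget (names and parameters may differ across package versions). It simulates a sparse additive model and compares a single CART tree against a greedy tree sum under a roughly matched number of splits.

```python
# Minimal sketch (assumptions: the `imodels` package provides FIGSRegressor
# with a sklearn-style fit/predict API and a `max_rules` split budget).
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error
from imodels import FIGSRegressor

rng = np.random.default_rng(0)

def sparse_additive_data(n, d=20, noise=0.1):
    """Sparse additive model: y = f1(x1) + f2(x2) + f3(x3) + noise,
    with the remaining d - 3 features irrelevant."""
    X = rng.uniform(-1, 1, size=(n, d))
    y = (np.sin(np.pi * X[:, 0])           # smooth component
         + X[:, 1] ** 2                    # quadratic component
         + (X[:, 2] > 0).astype(float)     # step component
         + noise * rng.standard_normal(n))
    return X, y

X_train, y_train = sparse_additive_data(1000)
X_test, y_test = sparse_additive_data(5000)

# A single CART tree must encode the additive structure in one tree,
# so it repeats (redundant) splits on the same features along many branches.
cart = DecisionTreeRegressor(max_leaf_nodes=30).fit(X_train, y_train)

# FIGS grows multiple shallow trees at once and sums their predictions,
# letting each tree specialize to one additive component.
figs = FIGSRegressor(max_rules=30).fit(X_train, y_train)

print("CART test MSE:", mean_squared_error(y_test, cart.predict(X_test)))
print("FIGS test MSE:", mean_squared_error(y_test, figs.predict(X_test)))
```

With a comparable split budget, the tree sum typically tracks the additive structure more closely than the single tree, which is the behavior the lecture's lower bounds and experiments examine.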

Syllabus

Abhineet Agarwal - Understanding and overcoming the statistical limitations of decision trees

Taught by

Institute for Pure & Applied Mathematics (IPAM)
