Explore the advanced topic of random tessellation forests in this 58-minute lecture by Eliza O'Reilly from the Hausdorff Center for Mathematics. Delve into the limitations of traditional random forests that use axis-aligned partitions and discover how oblique splits can improve performance by capturing dependencies between features. Examine the class of random tessellation forests generated by stable under iteration (STIT) tessellation processes from stochastic geometry, and learn how they achieve minimax optimal convergence rates for Lipschitz and C2 functions. Investigate the connection between stationary random tessellations and statistical learning theory, focusing on strategies to overcome the curse of dimensionality in high-dimensional feature spaces through optimal choices of the directional distribution for random tessellation forest estimators.
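To give a feel for the axis-aligned versus oblique contrast the lecture motivates, here is a minimal Python sketch (illustrative only, not code from the lecture) on synthetic data whose label depends on a linear combination of features: a single axis-aligned split cannot separate the classes, while an oblique split along the diagonal, the kind of direction an isotropic (STIT-style) directional distribution can propose, separates them exactly.

```python
# Illustrative sketch (assumed example, not from the lecture): one axis-aligned
# split versus one oblique split on data labeled by a linear combination of features.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: the label depends on x1 + x2, a direction no single axis captures.
X = rng.uniform(-1.0, 1.0, size=(5000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)


def stump_accuracy(direction: np.ndarray, threshold: float = 0.0) -> float:
    """Accuracy of a one-split classifier predicting 1 when X @ direction > threshold."""
    pred = (X @ direction > threshold).astype(int)
    return float(np.mean(pred == y))


# Axis-aligned split through the origin, as in a classic random forest cell boundary.
axis_acc = stump_accuracy(np.array([1.0, 0.0]))

# Oblique split along the diagonal, a direction axis-aligned partitions can never use.
oblique_acc = stump_accuracy(np.array([1.0, 1.0]) / np.sqrt(2.0))

print(f"axis-aligned split accuracy: {axis_acc:.2f}")   # roughly 0.75 on this data
print(f"oblique (diagonal) split accuracy: {oblique_acc:.2f}")  # 1.00 on this data
```

This toy comparison only hints at the lecture's point: the choice of directional distribution for the random cuts governs how well the resulting tessellation forest adapts to feature dependencies, which is where the minimax rate results and the high-dimensional analysis come in.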