Overview
Explore a 42-minute lecture from the Joint IFML/MPG Symposium at the Simons Institute, in which UC Berkeley's Nikita Zhivotovskiy examines how to overcome the challenges of learning threshold classifiers in adversarial sequential settings. Learn how the impossibility results for the binary loss can be bypassed by turning to alternative loss functions such as the quadratic, logarithmic, and hinge losses. Discover how sequential linear regression, classification, and logistic regression techniques adapt to scenarios where the design vectors are known in advance but their order is not. Understand how the exponential weights algorithm with data-dependent transductive priors achieves regret bounds even when the optimal solution has unbounded norm, and examine classification regret bounds that depend only on the dimension and the number of rounds rather than on the specific design vectors or their norms, a guarantee previously considered impossible without prior knowledge of the design vectors.
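To make the exponential weights idea mentioned above concrete, here is a minimal, hypothetical sketch of exponential weights over a finite grid of threshold classifiers under logarithmic loss. The uniform weighting over the grid merely stands in for the data-dependent transductive priors discussed in the lecture, and all names (`exponential_weights`, `thresholds`, `eta`) are illustrative rather than taken from the talk.

```python
import numpy as np

def exponential_weights(xs, ys, thresholds, eta=1.0):
    """Sketch: exponential weights over a finite set of threshold
    classifiers, evaluated with logarithmic (log) loss.

    xs, ys     : scalar inputs and {0, 1} labels revealed one round at a time
    thresholds : finite grid of candidate thresholds (stand-in for a prior)
    eta        : learning rate for the multiplicative update
    """
    def log_loss(p, y):
        # log loss of predicting probability p for label y
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -(y * np.log(p) + (1 - y) * np.log(1 - p))

    weights = np.ones(len(thresholds)) / len(thresholds)  # uniform prior over the grid
    total_loss = 0.0
    for x, y in zip(xs, ys):
        # each expert softly predicts 1 when x exceeds its threshold
        expert_preds = 1.0 / (1.0 + np.exp(-(x - thresholds)))
        # aggregate prediction: weight-averaged mixture of expert predictions
        p_hat = float(weights @ expert_preds)
        total_loss += log_loss(p_hat, y)
        # multiplicative update: down-weight experts that suffered large loss
        losses = np.array([log_loss(q, y) for q in expert_preds])
        weights *= np.exp(-eta * losses)
        weights /= weights.sum()
    return total_loss

if __name__ == "__main__":
    # synthetic example: labels generated by a fixed threshold at 0.3
    rng = np.random.default_rng(0)
    xs = rng.uniform(-1, 1, size=200)
    ys = (xs > 0.3).astype(int)
    grid = np.linspace(-1, 1, 101)
    print("cumulative log loss:", exponential_weights(xs, ys, grid))
```

This sketch only illustrates the general aggregation scheme; the lecture's contribution concerns how the prior over comparators is built from the (unordered) design vectors, which is not reproduced here.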
Syllabus
Bypassing the Impossibility of Online Learning Thresholds: Unbounded Losses and Transductive Priors
Taught by
Simons Institute