Redefining Context for Powerful Test-Time Adaptation Using Unlabeled Data
Massachusetts Institute of Technology via YouTube

Overview

Explore a technical talk on Test-Time Adaptation (TTA) for foundation models that leverages unlabeled data and redefines the concept of context. Learn how models can adapt to new target domains without expensive retraining by using unsupervised in-context learning techniques. Discover how this context-driven approach improves out-of-distribution generalization, keeps models robust in changing environments, and aligns representations with task-specific inductive biases such as fairness constraints. Gain insights into the broader implications for world models, planning, and robust decision-making. The talk is presented by MIT PhD candidate Sharut Gupta, whose research spans multi-modal representation learning and out-of-distribution generalization.
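For orientation only, the sketch below shows what a generic test-time adaptation loop on unlabeled data can look like: entropy minimization over an unlabeled test batch, updating only normalization-layer parameters (in the spirit of TENT). This is not the method presented in the talk, and all function names, model shapes, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch of generic test-time adaptation (TTA) via entropy
# minimization on unlabeled test batches. NOT the talk's method; all names
# and hyperparameters are assumptions for illustration.
import torch
import torch.nn as nn


def collect_norm_params(model: nn.Module):
    """Return affine parameters of normalization layers only, a common
    choice for lightweight test-time updates."""
    params = []
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.LayerNorm)):
            for p in (module.weight, module.bias):
                if p is not None:
                    p.requires_grad_(True)
                    params.append(p)
    return params


def entropy(logits: torch.Tensor) -> torch.Tensor:
    """Mean prediction entropy of a batch (requires no labels)."""
    log_probs = logits.log_softmax(dim=-1)
    return -(log_probs.exp() * log_probs).sum(dim=-1).mean()


def adapt_on_unlabeled_batch(model: nn.Module, x: torch.Tensor, lr: float = 1e-3):
    """One adaptation step: minimize prediction entropy on an unlabeled
    test batch, updating only normalization-layer parameters."""
    model.train()  # let norm layers use test-batch statistics
    for p in model.parameters():
        p.requires_grad_(False)
    optimizer = torch.optim.SGD(collect_norm_params(model), lr=lr)

    loss = entropy(model(x))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    with torch.no_grad():
        return model(x).argmax(dim=-1)  # predictions after adaptation


# Usage with a hypothetical model and target-domain batch:
if __name__ == "__main__":
    model = nn.Sequential(
        nn.Linear(32, 64), nn.BatchNorm1d(64), nn.ReLU(), nn.Linear(64, 10)
    )
    unlabeled_test_batch = torch.randn(16, 32)  # batch from the new target domain
    preds = adapt_on_unlabeled_batch(model, unlabeled_test_batch)
    print(preds.shape)  # torch.Size([16])
```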
Syllabus
Sharut Gupta - Redefining Context for Powerful Test-Time Adaptation Using Unlabeled Data
Taught by
MIT Embodied Intelligence