Overview
Learn about research in universal regression through this AutoML Seminar presentation on training language models as universal end-to-end regressors. The talk introduces OmniPred, a framework in which language models perform precise numerical regression across diverse experimental datasets. Extensive experiments on Google Vizier's large blackbox optimization database show that text-based representations of parameters and objective values can outperform conventional regression models. The speaker covers the methodology, implementation details, and practical applications, with supporting materials including the published paper and source code on GitHub.
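The core idea, serializing parameters and metric values as text so a language model can regress over them, can be sketched as below. This is an illustrative assumption, not OmniPred's actual serialization scheme: the helper names and the sign/mantissa/exponent format are hypothetical, chosen only to show why a text encoding of numbers can be tokenizer-friendly and invertible.

```python
import math

def serialize_params(params: dict) -> str:
    """Render a parameter dict as a text prompt for an LM regressor.

    Sorting keys makes the prompt deterministic regardless of dict order.
    """
    return ",".join(f"{k}:{v}" for k, v in sorted(params.items()))

def serialize_value(y: float, sig_digits: int = 4) -> str:
    """Render a float as sign/mantissa/exponent text (a common trick for
    making numeric targets easy for an LM to emit digit by digit)."""
    sign = "-" if y < 0 else "+"
    if y == 0:
        return f"{sign}{'0' * sig_digits}E+0"
    exp = math.floor(math.log10(abs(y)))
    mantissa = round(abs(y) / 10 ** exp * 10 ** (sig_digits - 1))
    return f"{sign}{mantissa}E{exp:+d}"

def deserialize_value(s: str) -> float:
    """Invert serialize_value: parse sign, mantissa, and exponent."""
    sign = -1.0 if s[0] == "-" else 1.0
    mantissa, exp = s[1:].split("E")
    return sign * int(mantissa) * 10.0 ** (int(exp) - (len(mantissa) - 1))

# A (prompt, target) text pair that a sequence model could be trained on:
prompt = serialize_params({"learning_rate": 1e-3, "batch_size": 64})
target = serialize_value(0.8734)  # e.g. a validation accuracy
```

In this sketch the regressor never sees raw floats: both inputs and outputs are plain strings, so a single model can be trained across tasks with heterogeneous parameter spaces, which is the property the talk emphasizes.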
Syllabus
Xingyou (Richard) Song - OmniPred: Towards Universal Regressors with Language Models
Taught by
AutoML Seminars