Overview
Learn about hyperparameter optimization and algorithm configuration through a seminar talk that bridges the gap between AutoML and black-box optimization. Explore how dynamic algorithm configuration can enhance traditional AutoML methods by adjusting an algorithm's behavior during the run itself, rather than relying solely on settings learned from pre-training data. Discover the challenges and opportunities in both fields as Professor Carola Doerr discusses the potential for unified approaches that combine the strengths of AutoML and black-box optimization. Delve into key topics including parameter control, Bayesian optimization vs. CMA-ES, standardization of data, benchmarking techniques for machine learning and DDQM, and explicit formulas. Gain insights into practical scenarios where training-based hyperparameter optimization is not feasible, and learn alternative strategies for algorithm configuration in resource-constrained environments.
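To make the idea of parameter control concrete, the sketch below shows a classic example from the black-box optimization literature: a (1+1) evolution strategy whose mutation strength is adapted online via the one-fifth success rule. This is a generic illustration of adjusting a parameter during the run rather than fixing it beforehand; the function names and constants are illustrative and not taken from the talk.

```python
import random

def sphere(x):
    # Objective to minimize: the classic "sphere" benchmark, sum of squares.
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, steps=200, seed=0):
    """(1+1)-ES with one-fifth-success-rule step-size control.

    Parameter control in action: the mutation strength sigma is adapted
    online from observed successes, instead of being tuned offline.
    Returns (initial_fitness, final_fitness). Illustrative sketch only.
    """
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = f0 = sphere(x)
    sigma = 1.0
    for _ in range(steps):
        # Propose an offspring by Gaussian mutation with current step size.
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = sphere(y)
        if fy <= fx:
            # Success: accept the offspring and increase the step size.
            x, fx = y, fy
            sigma *= 1.5
        else:
            # Failure: shrink the step size (factor chosen so that a 1/5
            # success rate keeps sigma roughly constant on average).
            sigma *= 1.5 ** (-0.25)
    return f0, fx
```

Running `one_plus_one_es()` returns the initial and final objective values; with step-size control the final value is no worse than the start, typically far better, which is exactly the kind of online adaptation that dynamic algorithm configuration generalizes.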
Syllabus
Introduction
My impression of AutoML
Black-box optimization
What is black-box optimization
Parameter control
Challenges
Holy Grail
Bayesian optimization vs. CMA-ES
Standardization of data
Benchmarking ML techniques
Benchmarking DDQM
Explicit formulas
Summary
Taught by
AutoML Seminars