Discovering Black-Box Optimizers via Evolutionary Meta-Learning
YouTube videos curated by Class Central.

Classroom Contents
- 1 Intro
- 2 'The Creation of Adam' - Michelangelo (ca. 1508-1512)
- 3 The Creation of AGI (by Adam) - ML Community
- 4 Envisioned excitement curve of this talk
- 5 What is Black-Box Optimization?
- 6 How does an Evolution Strategy work?
- 7 Challenges for Modern Evolutionary Optimization?
- 8 What is the power of JAX for Evolutionary Optimization? Parallel/Accelerated Fitness Rollouts
- 9 evosax: Accelerated Evolutionary Optimization
- 10 Discovering New Algorithms via Meta-Learning
- 11 Discovering New Algorithms via Meta-Evolution
- 12 Why not use Meta-Gradients instead of Meta-Evolution?
- 13 Discovering Evolutionary Optimizers
- 14 White-Box Evolution Strategy: Gaussian Search (sketched in code after this list)
- 15 Learned Evolution Strategy (LES) Architecture
- 16 Meta-Training Details for LES Discovery: BBOB Functions
- 17 Discovering LES: Meta-Training on Low-D BBOB
- 18 Evaluating LES: Brax Control Tasks
- 19 Scaling Meta-Distributions Improves LES Discovery
- 20 What Has The Learned Evolution Strategy Discovered?
- 21 Self-Referential Meta-Evolution of Learned ES
- 22 How does a Genetic Algorithm work?
- 23 Learned Genetic Algorithms (LGA)
- 24 LGA Generalizes to HPO-B & Neuroevolution Tasks
- 25 LGA Applies Adaptive Elitism & Mutation Rate Adaptation
- 26 On Survivorship Bias & The Hardware Lottery (Hooker, 2021)
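
As a companion to items 6 and 14 above (how an Evolution Strategy works, and the white-box Gaussian search it is built on), the following is a minimal sketch of the plain ask/evaluate/tell loop in JAX. The sphere fitness, population size, elite fraction, and learning rate are illustrative assumptions; this is not the learned LES/LGA algorithms presented in the talk.

```python
# Minimal sketch of a plain Gaussian Evolution Strategy (ask/evaluate/tell loop) in JAX.
# All hyperparameters and the toy fitness below are illustrative assumptions.
import jax
import jax.numpy as jnp

def sphere(x):
    # Toy fitness to minimise: squared norm, optimum at the origin.
    return jnp.sum(x ** 2)

def ask(rng, mean, sigma, popsize):
    # Sample a Gaussian population around the current search mean.
    noise = jax.random.normal(rng, (popsize, mean.shape[0]))
    return mean + sigma * noise

def tell(mean, population, fitness, elite_frac=0.5, lr=1.0):
    # Recombination: shift the mean toward the elite (lowest-fitness) candidates.
    num_elite = int(elite_frac * population.shape[0])
    elite = population[jnp.argsort(fitness)[:num_elite]]
    return mean + lr * (elite.mean(axis=0) - mean)

rng = jax.random.PRNGKey(0)
mean, sigma, popsize = jnp.zeros(10), 0.5, 32
for generation in range(50):
    rng, rng_ask = jax.random.split(rng)
    population = ask(rng_ask, mean, sigma, popsize)
    fitness = jax.vmap(sphere)(population)  # fitness evaluated across the population
    mean = tell(mean, population, fitness)
print("final mean fitness:", float(sphere(mean)))
```

The `jax.vmap` call is what makes the fitness evaluations trivially parallel on accelerators, which is the point item 8 makes about using JAX for evolutionary optimization.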