Overview
Watch a 43-minute AutoML seminar presentation exploring innovative approaches to multi-objective optimization in neural architecture search (NAS). Learn how to balance performance and hardware metrics across multiple devices with a novel algorithm that encodes user preferences for trade-offs between objectives. Discover how the proposed method uses a hypernetwork to parameterize the joint architectural distribution, enabling zero-shot transferability to new devices. Examine extensive experimental results spanning 19 hardware devices and 3 objectives, demonstrating the method's effectiveness and scalability. Understand how this approach outperforms existing multi-objective NAS methods across different search spaces and datasets, including the MobileNetV3 space on ImageNet-1k and the Transformer space on machine translation. Presented by Arber Zela, the talk includes access to the research paper and implementation code for practical application.
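To make the core idea concrete, below is a minimal sketch of a preference-conditioned hypernetwork: it maps a sampled preference vector over objectives plus a device embedding to a distribution over candidate operations. All names, dimensions, and the conditioning scheme are illustrative assumptions for this sketch, not the authors' actual implementation (see the linked paper and code for that).

```python
import torch
import torch.nn as nn

class PreferenceHypernetwork(nn.Module):
    """Hypothetical sketch: maps (preference over objectives, device embedding)
    to a per-edge categorical distribution over candidate operations."""

    def __init__(self, num_objectives: int, device_embed_dim: int,
                 num_edges: int, num_ops: int, hidden_dim: int = 128):
        super().__init__()
        self.num_edges = num_edges
        self.num_ops = num_ops
        self.net = nn.Sequential(
            nn.Linear(num_objectives + device_embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_edges * num_ops),
        )

    def forward(self, preference: torch.Tensor, device_embedding: torch.Tensor) -> torch.Tensor:
        # preference: simplex weights over objectives, e.g. (accuracy, latency, energy)
        # device_embedding: features describing the target hardware device
        logits = self.net(torch.cat([preference, device_embedding], dim=-1))
        # Architecture distribution: softmax over operations for each edge
        return logits.view(-1, self.num_edges, self.num_ops).softmax(dim=-1)


# Usage: sample a preference from the simplex, condition on a device embedding,
# and obtain an architecture distribution. Conditioning on an unseen device's
# embedding is what would enable zero-shot transfer in this setup.
hypernet = PreferenceHypernetwork(num_objectives=3, device_embed_dim=16,
                                  num_edges=14, num_ops=8)
pref = torch.distributions.Dirichlet(torch.ones(3)).sample().unsqueeze(0)
device_feat = torch.randn(1, 16)  # placeholder embedding for one hardware device
arch_dist = hypernet(pref, device_feat)
print(arch_dist.shape)  # torch.Size([1, 14, 8])
```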
Syllabus
Multi-objective Differentiable Neural Architecture Search
Taught by
AutoML Seminars