Hyperparameter Tuning Using Kubeflow

Linux Foundation via YouTube

Overview

Explore hyperparameter tuning and neural architecture search using Katib, a Kubernetes-native automated machine learning platform within Kubeflow. Learn how to improve model performance by searching for optimal hyperparameter values, and discover how networks generated by NAS algorithms can outperform handcrafted neural networks. Dive into Katib's rich set of management APIs, configure and run experiments, and compare performance using the UI dashboard. Follow along with demonstrations of setting up experiments, configuring search spaces, and viewing experiment results and trial metrics. Gain insights into the landscape of automated machine learning, understand the workflow for neural architecture search, and learn about future developments and opportunities to contribute to this technology.
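The experiment, suggestion, and trial concepts covered in the talk correspond to Kubernetes custom resources that Katib manages. As a rough illustration, here is a minimal sketch of submitting a hyperparameter tuning experiment with the official Kubernetes Python client; it assumes a cluster with Kubeflow installed, and the experiment name, container image, and training script are hypothetical placeholders rather than the exact ones used in the demo.

    # Minimal sketch (not from the talk) of creating a Katib Experiment.
    # Assumes the official `kubernetes` Python client and a Kubeflow cluster;
    # the name, image, and script path below are hypothetical placeholders.
    from kubernetes import client, config

    experiment = {
        "apiVersion": "kubeflow.org/v1beta1",
        "kind": "Experiment",
        "metadata": {"name": "mnist-random-search", "namespace": "kubeflow"},
        "spec": {
            # Stop once validation accuracy reaches 0.99, or after 12 trials.
            "objective": {
                "type": "maximize",
                "goal": 0.99,
                "objectiveMetricName": "accuracy",
            },
            "algorithm": {"algorithmName": "random"},
            "parallelTrialCount": 3,
            "maxTrialCount": 12,
            "maxFailedTrialCount": 3,
            # Search space: a continuous learning rate and an integer layer count.
            "parameters": [
                {
                    "name": "lr",
                    "parameterType": "double",
                    "feasibleSpace": {"min": "0.01", "max": "0.03"},
                },
                {
                    "name": "num-layers",
                    "parameterType": "int",
                    "feasibleSpace": {"min": "2", "max": "5"},
                },
            ],
            # Each trial runs a Job with the suggested values substituted
            # into the training command.
            "trialTemplate": {
                "primaryContainerName": "training-container",
                "trialParameters": [
                    {"name": "learningRate", "reference": "lr",
                     "description": "Learning rate"},
                    {"name": "numberLayers", "reference": "num-layers",
                     "description": "Number of layers"},
                ],
                "trialSpec": {
                    "apiVersion": "batch/v1",
                    "kind": "Job",
                    "spec": {
                        "template": {
                            "spec": {
                                "containers": [{
                                    "name": "training-container",
                                    "image": "docker.io/kubeflowkatib/mxnet-mnist",  # placeholder
                                    "command": [
                                        "python3", "/opt/mxnet-mnist/mnist.py",
                                        "--lr=${trialParameters.learningRate}",
                                        "--num-layers=${trialParameters.numberLayers}",
                                    ],
                                }],
                                "restartPolicy": "Never",
                            }
                        }
                    },
                },
            },
        },
    }

    config.load_kube_config()  # or load_incluster_config() inside the cluster
    client.CustomObjectsApi().create_namespaced_custom_object(
        group="kubeflow.org",
        version="v1beta1",
        namespace="kubeflow",
        plural="experiments",
        body=experiment,
    )

After submission, Katib's suggestion service proposes values from the feasible space, each trial runs a training job with those values injected into its command line, and the resulting metrics can be compared in the Katib UI dashboard.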

Syllabus

Intro
An Example: Digits Recognition with MNIST
What is Hyperparameter Tuning?
Why is Hyperparameter Tuning Hard?
How does Kubernetes Help?
Introducing Kubeflow
Katib: Hyperparameter Tuning in Kubeflow
Concepts: Experiment
Concepts: Suggestion
Concepts: Trial
Workflow for Hyperparameter Tuning
System Architecture
Demo: Setting Up an Experiment
Demo: Configuring Search Space
Demo: Viewing Experiment Results
Demo: Viewing Trial Metrics
Classical vs Automated Machine Learning
Landscape of Automated Machine Learning
Workflow for Neural Architecture Search
What's Coming?
How to Contribute?

Taught by

Linux Foundation
