
Serverless Machine Learning Model Inference on Kubernetes with KServe

Devoxx via YouTube

Overview

Explore serverless machine learning model inference on Kubernetes using KServe in this 38-minute conference talk from Devoxx. Learn how to integrate popular ML frameworks for straightforward model inference and prototyping, leverage Knative Serving for cost-effective autoscaling, build complex ML pipelines with inference graphs, and implement effective monitoring and deployment strategies. The talk walks through several deployment scenarios in detail, giving you practical guidance for serving your own models on Kubernetes.
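To give a flavor of the deployment model the talk covers, a minimal KServe InferenceService manifest might look roughly like the following sketch. The model name and storage URI here are illustrative assumptions, not taken from the talk:

```yaml
# Hypothetical example: serving a scikit-learn model with KServe.
# Knative Serving autoscales the predictor, including scale-to-zero when idle.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris              # illustrative name
spec:
  predictor:
    minReplicas: 0                # allow scale-to-zero for cost savings
    sklearn:
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
```

Applying a manifest like this with `kubectl apply -f` creates an HTTP inference endpoint backed by the stored model, with scaling handled by Knative rather than a fixed replica count.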

Syllabus

Serverless Machine Learning Model Inference on Kubernetes with KServe by Stavros Kontopoulos

Taught by

Devoxx

