

Building an ML Inference Platform with Knative - Serverless Containers on Kubernetes

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore the development of a machine learning inference platform using Knative in this conference talk. Learn how Bloomberg LP and IBM leveraged Knative's serverless capabilities to simplify and accelerate the deployment and scaling of ML-driven applications in production. Discover the advantages of Knative for running serverless containers on Kubernetes, including automated networking, request volume-based autoscaling, and revision tracking. Gain insights into the evolution of the KServe project (formerly KFServing) and how Knative enables blue/green/canary rollout strategies for safe ML model updates. Understand how to improve GPU utilization with scale-to-zero functionality and how to build Apache Kafka events-based inference pipelines. Examine benchmarks comparing Knative autoscaling to the Kubernetes Horizontal Pod Autoscaler (HPA) and learn performance optimization tips for running large numbers of Knative services in a single cluster.
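The canary rollouts and scale-to-zero behavior described above both rest on Knative's Service resource, which tracks revisions and splits traffic between them. Below is a minimal sketch, not taken from the talk, of how such a service might be declared and applied with the official kubernetes Python client; the model name, namespace, container image, and revision name are hypothetical placeholders, and KServe in practice wraps this pattern in its own InferenceService abstraction rather than raw Knative Services.

```python
# Sketch: declare a Knative Service with a scale-to-zero-capable autoscaling
# bound and a 90/10 canary traffic split, then apply it as a custom object.
# All names and the image are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a cluster

knative_service = {
    "apiVersion": "serving.knative.dev/v1",
    "kind": "Service",
    "metadata": {"name": "sklearn-iris", "namespace": "models"},
    "spec": {
        "template": {
            "metadata": {
                "annotations": {
                    # minScale of 0 permits scale-to-zero, so an idle
                    # (e.g. GPU-backed) revision releases its pods entirely.
                    "autoscaling.knative.dev/minScale": "0",
                    "autoscaling.knative.dev/maxScale": "5",
                }
            },
            "spec": {
                "containers": [
                    {
                        "image": "example.com/models/sklearn-iris:v2",  # hypothetical image
                        "ports": [{"containerPort": 8080}],
                    }
                ]
            },
        },
        # Canary rollout: keep 90% of traffic on the previous revision and
        # send 10% to the newest revision until it proves healthy.
        "traffic": [
            {"revisionName": "sklearn-iris-v1", "percent": 90},
            {"latestRevision": True, "percent": 10},
        ],
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="serving.knative.dev",
    version="v1",
    namespace="models",
    plural="services",
    body=knative_service,
)
```

Shifting the percentages in the traffic block (and eventually dropping the pinned revision) is what makes blue/green and canary promotion a declarative, revision-tracked operation rather than a manual redeploy.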

Syllabus

How We Built an ML Inference Platform with Knative - Dan Sun, Bloomberg LP & Animesh Singh, IBM

Taught by

CNCF [Cloud Native Computing Foundation]

