Serving Machine Learning Models at Scale Using KServe
CNCF [Cloud Native Computing Foundation] via YouTube
Overview
Explore scalable deployment of machine learning models with KServe in this conference talk. Learn about this serverless, open-source model-serving solution and discover how the KServe community designed Multi-Model Serving to address the limitations of the current 'one model, one service' paradigm. Delve into the design of Multi-Model Serving, understand how it serves models across different frameworks, and examine benchmark results demonstrating its scalability. Gain insight into overcoming constraints on compute resources, maximum pod counts, IP addresses, and Kubernetes services when deploying large numbers of models, particularly models that require GPU resources.
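For context, the 'one model, one service' paradigm mentioned above gives each model its own InferenceService, and therefore its own pods and IP address. The sketch below, based on the KServe Python SDK's documented usage, deploys a single scikit-learn model in that style; the resource name and storage URI are illustrative placeholders, not taken from the talk, and the Multi-Model Serving design discussed in the talk instead lets many models share a single serving instance.

# Minimal sketch of the "one model, one service" pattern with the KServe Python SDK.
# Each model deployed this way gets its own InferenceService, pods, and IP --
# the per-model overhead that Multi-Model Serving is designed to avoid.
from kubernetes import client
from kserve import (
    KServeClient,
    constants,
    V1beta1InferenceService,
    V1beta1InferenceServiceSpec,
    V1beta1PredictorSpec,
    V1beta1SKLearnSpec,
)

# Illustrative name and storage URI (placeholders).
isvc = V1beta1InferenceService(
    api_version=constants.KSERVE_V1BETA1,
    kind=constants.KSERVE_KIND,
    metadata=client.V1ObjectMeta(name="sklearn-example", namespace="default"),
    spec=V1beta1InferenceServiceSpec(
        predictor=V1beta1PredictorSpec(
            sklearn=V1beta1SKLearnSpec(
                storage_uri="gs://your-bucket/models/sklearn/model"
            )
        )
    ),
)

KServeClient().create(isvc)  # one service (and at least one pod) per model

Deploying hundreds or thousands of models this way quickly runs into the cluster-level limits the talk describes, which is the motivation for Multi-Model Serving.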
Syllabus
Serving Machine Learning Models at Scale Using KServe - Animesh Singh, IBM
Taught by
CNCF [Cloud Native Computing Foundation]