
YouTube

AI Deployment: Mastering LLMs with KFServing in Kubernetes

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore the intricacies of deploying Large Language Models (LLMs) in Kubernetes using KFServing (now continued as KServe) in this 14-minute conference talk. Delve into the integration of LLMs within cloud-native ecosystems, combining Kubernetes' scalability with KFServing's model serving capabilities. Learn best practices for deploying, managing, and optimizing LLMs in a Kubernetes environment, with an emphasis on efficient resource utilization and high-performance inference. Gain insights from Irvi Firqotul Aini of Mercari as she shares expertise on AI deployment strategies in the rapidly evolving field of artificial intelligence. Suited to AI practitioners and cloud engineers looking to deepen their knowledge of LLM deployment techniques.
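To make the deployment pattern the talk covers concrete, here is a minimal sketch of serving a model through KFServing's successor KServe, using its `v1beta1` `InferenceService` resource with a custom predictor container. The name, image, and resource figures below are illustrative placeholders, not details from the talk:

```yaml
# Hypothetical InferenceService for an LLM; names and values are placeholders.
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: llm-demo                  # placeholder service name
spec:
  predictor:
    minReplicas: 1                # keep one replica warm; 0 allows scale-to-zero
    maxReplicas: 3                # cap autoscaling for cost control
    containers:
      - name: kserve-container
        image: registry.example.com/llm-server:latest  # placeholder image
        resources:
          requests:
            cpu: "4"
            memory: 16Gi
          limits:
            nvidia.com/gpu: "1"   # one GPU per replica for inference
```

Applied with `kubectl apply -f`, a manifest like this has the controller provision the serving pods, autoscaling, and an HTTP inference endpoint, which is the resource-utilization and scaling story the talk discusses.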

Syllabus

AI Deployment: Mastering LLMs with KFServing in Kubernetes - Irvi Firqotul Aini, Mercari

Taught by

CNCF [Cloud Native Computing Foundation]

