
Optimizing LLM Performance in Kubernetes with OpenTelemetry

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Learn how to troubleshoot and optimize Large Language Model (LLM) deployments in Kubernetes using OpenTelemetry. In this 32-minute conference talk, speakers from Google and Microsoft demonstrate end-to-end client and server observability, showing how standardized metrics shared across LLM clients and model servers make performance issues easier to diagnose. The talk also covers Kubernetes autoscaling driven by custom model server metrics rather than traditional GPU utilization, along with practical techniques and best practices for tuning LLM server setups in cloud-native environments.
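As a flavor of the client-side observability the talk discusses, the sketch below records LLM call duration and token usage with the OpenTelemetry Python SDK. It is a minimal illustration and not material from the talk itself: the metric and attribute names follow the (still experimental) OpenTelemetry GenAI semantic conventions, and fake_llm_call is a hypothetical stand-in for a real model client.

# Minimal sketch: client-side LLM metrics with the OpenTelemetry Python SDK.
# Metric/attribute names follow the experimental GenAI semantic conventions;
# fake_llm_call is a stand-in for a real model client, not part of the talk.
import time
from opentelemetry import metrics
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import (
    ConsoleMetricExporter,
    PeriodicExportingMetricReader,
)

# Export to the console here; in a cluster you would use an OTLP exporter
# pointed at an OpenTelemetry Collector.
metrics.set_meter_provider(
    MeterProvider(
        metric_readers=[PeriodicExportingMetricReader(ConsoleMetricExporter())]
    )
)
meter = metrics.get_meter("llm.client.example")

duration_hist = meter.create_histogram(
    name="gen_ai.client.operation.duration",
    unit="s",
    description="Duration of LLM client calls",
)
token_hist = meter.create_histogram(
    name="gen_ai.client.token.usage",
    unit="{token}",
    description="Input and output tokens per call",
)

def fake_llm_call(prompt: str) -> dict:
    """Hypothetical stand-in for a real model client; returns dummy token counts."""
    time.sleep(0.05)
    return {"text": "ok", "input_tokens": len(prompt.split()), "output_tokens": 42}

def instrumented_chat(prompt: str, model: str = "example-model") -> str:
    attrs = {"gen_ai.operation.name": "chat", "gen_ai.request.model": model}
    start = time.monotonic()
    response = fake_llm_call(prompt)
    duration_hist.record(time.monotonic() - start, attributes=attrs)
    token_hist.record(response["input_tokens"],
                      attributes={**attrs, "gen_ai.token.type": "input"})
    token_hist.record(response["output_tokens"],
                      attributes={**attrs, "gen_ai.token.type": "output"})
    return response["text"]

if __name__ == "__main__":
    print(instrumented_chat("How do I autoscale a vLLM deployment?"))

Metrics like these, exposed consistently by both clients and model servers, are what the speakers suggest feeding into Kubernetes autoscaling in place of raw GPU utilization.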

Syllabus

Optimizing LLM Performance in Kubernetes with OpenTelemetry - Ashok Chandrasekar & Liudmila Molkova

Taught by

CNCF [Cloud Native Computing Foundation]

