

Cloud Native Sustainable LLM Inference in Action

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore sustainable large language model (LLM) inference with cloud-native technologies in this tutorial. Delve into LLM energy consumption and Kepler's role in monitoring power during LLM workloads. Discover how adjusting AI accelerator frequencies can balance environmental sustainability with computational efficiency for optimized LLM inference. Watch a live demonstration of vLLM, an inference framework, in which accelerator settings are tuned in a Kubernetes cluster to strike a favorable power-computation balance. Whether you're a developer, an IT specialist, or a sustainability advocate, gain insight into eco-friendly cloud computing and learn how to integrate environmental sustainability into cloud-native technology solutions.
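The power-computation balance described above reduces to simple arithmetic: energy per generated token is power draw (watts, i.e. joules per second) divided by token throughput (tokens per second). Capping accelerator frequency lowers power more than it lowers throughput in some operating regimes, improving energy per token. A minimal sketch of that calculation, using made-up illustrative numbers rather than figures from the talk:

```python
def energy_per_token(power_watts: float, tokens_per_second: float) -> float:
    """Joules per generated token: power (W = J/s) divided by throughput (tok/s)."""
    return power_watts / tokens_per_second

# Hypothetical operating points for one accelerator at two frequency caps
# (numbers invented for illustration, not measured).
uncapped = energy_per_token(power_watts=300.0, tokens_per_second=100.0)  # 3.0 J/token
capped = energy_per_token(power_watts=180.0, tokens_per_second=80.0)     # 2.25 J/token

print(f"uncapped: {uncapped:.2f} J/token, frequency-capped: {capped:.2f} J/token")
```

In this invented scenario, capping the frequency trades 20% of throughput for 40% less power, so each token costs less energy; tools like Kepler supply the real power measurements needed to find such operating points on actual hardware.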

Syllabus

Tutorial: Cloud Native Sustainable LLM Inference in Action

Taught by

CNCF [Cloud Native Computing Foundation]

