
DevOps for AI: Running LLMs in Production with Kubernetes and KubeFlow

WeAreDevelopers via YouTube

Overview

Explore the intersection of DevOps and AI in this 34-minute talk from WeAreDevelopers. Dive into the practical aspects of deploying and managing Large Language Models (LLMs) in production environments using Kubernetes and KubeFlow. Learn how to leverage these powerful tools to streamline the deployment process, ensure scalability, and maintain high performance for AI applications. Gain insights into best practices for containerization, orchestration, and workflow management specifically tailored for LLMs. Discover strategies to overcome common challenges in AI deployment and understand how DevOps principles can be applied to machine learning operations. Whether you're a developer, data scientist, or DevOps engineer, this talk provides valuable knowledge for running sophisticated AI models in real-world production scenarios.
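The talk's own demos are not reproduced in this listing, but as an illustration of the Kubernetes side of serving an LLM, a minimal Deployment manifest might look like the sketch below. The image name, port, health endpoint, and resource figures are assumptions for illustration, not details from the talk:

```yaml
# Illustrative sketch only -- image, port, and resource values are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llm-server
spec:
  replicas: 2
  selector:
    matchLabels:
      app: llm-server
  template:
    metadata:
      labels:
        app: llm-server
    spec:
      containers:
      - name: llm-server
        image: ghcr.io/example/llm-server:latest  # hypothetical image
        ports:
        - containerPort: 8080
        resources:
          requests:
            cpu: "4"
            memory: 16Gi
          limits:
            nvidia.com/gpu: 1        # one GPU per replica
        readinessProbe:              # keep traffic away until the model has loaded
          httpGet:
            path: /healthz           # hypothetical health endpoint
            port: 8080
          initialDelaySeconds: 60
```

A readiness probe with a generous initial delay matters for LLM serving because model weights can take minutes to load; without it, Kubernetes would route requests to replicas that cannot yet answer them.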

Syllabus

DevOps for AI: running LLMs in production with Kubernetes and KubeFlow

Taught by

WeAreDevelopers

