
Confidential Containers for GPU Compute - Incorporating LLMs in AI Lift-and-Shift Strategy

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore the evolution of confidential containers and their integration with the GPU cloud-native stack for AI/ML workloads in this 32-minute conference talk. Delve into the transition from traditional container deployments to secure, isolated environments for processing sensitive data. Learn how Kata Containers enables confidential containers, preserving container flexibility while adding hardware-backed isolation. Examine a virtualization reference architecture that supports advanced scenarios such as GPUDirect RDMA. Discover a lift-and-shift approach for migrating existing AI/ML workloads into confidential environments with minimal changes. Understand how this integration combines LLMs with GPU-accelerated computing, using Kubernetes for orchestration while balancing computational power against data privacy requirements.

Syllabus

Confidential Containers for GPU Compute: Incorporating LLMs in a Lift-and-Shift Strategy for AI

Taught by

CNCF [Cloud Native Computing Foundation]
