Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI
CNCF [Cloud Native Computing Foundation] via YouTube
Overview
Explore a case study on simplifying private Large Language Model (LLM) deployments using cloud-native technologies, specifically Kubernetes and OCI Artifacts. Discover how these tools address data governance and security challenges while enabling efficient sharing of large artifacts between model developers and consumers. Learn about the benefits of Kubernetes in delivering a highly portable, cloud-native inference stack, and understand how OCI Artifacts deliver significant efficiency gains by reducing duplicate storage, increasing download speed, and minimizing governance overhead. Gain valuable insights into incorporating Kubernetes and OCI into your MLOps journey for seamless LLM delivery in private cloud environments.
Syllabus
Mastering LLM Delivery in Private Clouds: A Journey to Seamless Deployments with Kubernetes and OCI - Autumn Moulder & Marwan Ahmed
Taught by
CNCF [Cloud Native Computing Foundation]