Overview
Explore the growing demand for running Large Language Models (LLMs) in cloud environments through this 40-minute keynote presentation from the Linux Foundation. Gain insight into the pressing needs of developers and enterprises for open-source LLMs and learn best practices for cloud-native deployment. Examine three key deployment approaches: Python-based solutions, native runtimes such as llama.cpp or vLLM, and WebAssembly as an abstraction layer. Understand the benefits, challenges, real-world applications, integration capabilities, portability, and resource efficiency of each deployment method. Survey the CNCF Cloud Native AI (CNAI) ecosystem landscape while demystifying cloud-native AI implementation. Walk away with practical deployment strategies, a clear implementation roadmap, and the knowledge to evaluate and select the most appropriate LLM deployment approach for specific use cases.
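One practical thread connecting the three deployment approaches above is that popular open-source serving stacks (for example vLLM's server and llama.cpp's server mode) commonly expose an OpenAI-compatible HTTP API, so a single client can switch backends by changing the base URL. The sketch below is illustrative only and not from the talk; the model name, endpoint path, and parameter values are assumptions.

```python
import json

# Hypothetical sketch: build an OpenAI-style chat-completion payload
# that could be POSTed to any OpenAI-compatible LLM server
# (e.g. a locally hosted vLLM or llama.cpp instance).
# The model name and endpoint below are placeholder assumptions.
CHAT_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed URL

def build_chat_request(model: str, prompt: str, max_tokens: int = 128) -> dict:
    """Return an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("llama-3-8b-instruct", "What is WebAssembly?")
body = json.dumps(payload)  # serialized request body, ready to send
print(body)
```

Because the request shape is identical across backends, evaluating the deployment approaches discussed in the keynote can focus on operational trade-offs (portability, resource use, integration) rather than client-side rewrites.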
Syllabus
Open Source LLMs in the Cloud: Scalable Solutions - Miley Fu, WasmEdge & Hung-Ying Tai, Second State
Taught by
Linux Foundation