Wasm as the Runtime for LLMs - Advantages and Applications

CNCF [Cloud Native Computing Foundation] via YouTube

Overview

Explore how WebAssembly (Wasm) is emerging as the preferred runtime for Large Language Models (LLMs) in this 27-minute conference talk by Michael Yuan from Second State. Learn why Python-based LLM applications struggle with slow performance, heavy dependencies, and installation complexity, and discover why compiled languages like C, C++, and Rust are gaining traction in LLM frameworks. Delve into the advantages of using Rust and Wasm to develop and run LLM applications, including better efficiency, safety, and performance in a smaller footprint. Watch demonstrations of running Llama 2 models in Wasm and building LLM agents in Rust. Gain insight into real-world applications, such as LLM-based code review and book-based learning assistants, through the practical examples and demos presented by the speaker.
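
For a concrete sense of what "running Llama 2 in Wasm" can look like, here is a minimal sketch of LLM inference from Rust compiled to wasm32-wasi through the WASI-NN interface, in the style of WasmEdge-based tooling such as LlamaEdge. This is not code from the talk: the crate name, the "default" model alias, and the buffer size are assumptions for illustration.

```rust
// Minimal sketch: LLM inference from Rust compiled to wasm32-wasi via WASI-NN.
// Assumes the wasmedge-wasi-nn crate and a host that has preloaded a GGUF model
// under the alias "default", e.g.:
//   wasmedge --nn-preload default:GGML:AUTO:llama-2-7b-chat.gguf app.wasm
use wasmedge_wasi_nn::{ExecutionTarget, GraphBuilder, GraphEncoding, TensorType};

fn main() {
    // Load the model the Wasm host has preloaded under the "default" alias.
    let graph = GraphBuilder::new(GraphEncoding::Ggml, ExecutionTarget::AUTO)
        .build_from_cache("default")
        .expect("model not preloaded by the host");
    let mut ctx = graph
        .init_execution_context()
        .expect("failed to create execution context");

    // Feed the prompt as a UTF-8 byte tensor and run inference.
    let prompt = "Explain why Wasm is a good runtime for LLM inference.";
    ctx.set_input(0, TensorType::U8, &[1], prompt.as_bytes())
        .expect("failed to set input");
    ctx.compute().expect("inference failed");

    // Read back the generated text (output buffer size is an arbitrary assumption).
    let mut output = vec![0u8; 4096];
    let n = ctx.get_output(0, &mut output).expect("failed to read output");
    println!("{}", String::from_utf8_lossy(&output[..n]));
}
```

Because the Wasm module only talks to the WASI-NN interface, the same small binary can run on any host that provides a compatible backend, which is part of the portability and footprint argument the talk makes.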

Syllabus

Wasm Is Becoming the Runtime for LLMs - Michael Yuan, Second State

Taught by

CNCF [Cloud Native Computing Foundation]
