This 36-minute conference talk explores techniques for running cross-platform GPU and AI workloads in container ecosystems, focusing on Docker's integration of the WebGPU standard. Learn how WebGPU lets containerized applications access host GPUs and AI accelerators through a single portable API, eliminating the need for Docker images tailored to specific GPU vendors and their proprietary drivers. Watch a demonstration of the WasmEdge project using WebGPU to build portable LLM inference applications in Rust, and see how Docker manages and orchestrates those applications. Gain insight into the future of cross-platform AI development and deployment with Docker, and into how this approach can streamline AI workflows across different hardware configurations.