Learn about dynamic GPU scaling in container applications through this conference talk that explores Composable Disaggregated Infrastructure (CDI) for the AI era. Discover how to address the growing computational demands of AI and ML workloads in Kubernetes environments while maintaining energy efficiency. Explore the CDI server architecture, which enables on-demand resource allocation by composing devices such as compute, memory, storage, and GPUs over a PCIe or CXL switch fabric. Understand how CDI operators and Custom Resource Definitions are implemented, and how they support advanced vertical and horizontal cluster auto-scaling. See demonstrations of dynamically attaching and detaching devices on nodes using Dynamic Resource Allocation (DRA), providing a practical way to balance high performance with sustainable power consumption in modern container environments.
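To give a feel for the workflow the talk describes, here is a minimal Python sketch using the official Kubernetes client. It assumes a hypothetical CDI operator that reconciles a "ComposabilityRequest" custom resource (the cdi.example.com group, version, kind, and spec fields are illustrative, not the actual CRD from the talk), and it then creates a DRA ResourceClaim; the exact resource.k8s.io API version and device class name depend on the cluster and the installed DRA driver.

```python
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
api = client.CustomObjectsApi()

# Ask the (hypothetical) CDI operator to compose one more GPU onto a node
# over the PCIe/CXL switch fabric by creating a custom resource it watches.
composability_request = {
    "apiVersion": "cdi.example.com/v1alpha1",   # illustrative group/version
    "kind": "ComposabilityRequest",             # illustrative kind
    "metadata": {"name": "scale-up-gpu"},
    "spec": {
        "targetNode": "worker-1",   # node that should receive the device
        "deviceType": "gpu",
        "count": 1,
    },
}
api.create_namespaced_custom_object(
    group="cdi.example.com",
    version="v1alpha1",
    namespace="default",
    plural="composabilityrequests",
    body=composability_request,
)

# Once the device is attached, a workload can claim it through DRA.
resource_claim = {
    "apiVersion": "resource.k8s.io/v1beta1",    # version varies by cluster
    "kind": "ResourceClaim",
    "metadata": {"name": "gpu-claim"},
    "spec": {
        "devices": {
            "requests": [
                {"name": "gpu", "deviceClassName": "gpu.example.com"}
            ]
        }
    },
}
api.create_namespaced_custom_object(
    group="resource.k8s.io",
    version="v1beta1",
    namespace="default",
    plural="resourceclaims",
    body=resource_claim,
)
```

A pod would then reference the claim by name in its spec's resourceClaims section, and scaling down would follow the reverse path: the claim is released and the operator detaches the GPU from the node, returning it to the shared fabric pool.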