Near Memory Compute for AI Inferencing - Optimizing Data Center Design and TCO
Open Compute Project via YouTube
Overview
Explore a 21-minute technical presentation on Near-Memory Compute (NMC), an approach to the data-movement challenges facing AI inferencing data centers. Learn how low-cost remote memory, connected through low-latency interconnects, can optimize inferencing operations. Discover the benefits of offloading specific inferencing tasks to smaller cores positioned near the remote memory, supported by simulation data showing reduced execution latency. Understand how cost-effective remote memory pools can reduce Total Cost of Ownership (TCO) in inferencing data centers, and gain insight into data center design principles that prioritize both sustainability and operational efficiency.
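To make the offload idea concrete, below is a minimal first-order latency sketch in Python. All numbers are illustrative assumptions (a 50 Gbps link, microsecond-scale round-trip and compute times), not figures from the presentation: a host-side operation must pull its full operands across the interconnect, while a near-memory core computes beside the data and sends back only a small result.

```python
# Toy latency model for one memory-bound inference step (e.g., gathering
# embedding rows from a remote memory pool). Constants are hypothetical,
# chosen only to illustrate the data-movement trade-off behind NMC.

def host_latency_us(operand_bytes, link_gbps=50.0, rtt_us=1.0, compute_us=2.0):
    """Host core executes the step: all operands cross the interconnect."""
    transfer_us = operand_bytes * 8 / (link_gbps * 1e3)  # Gbps -> bits/us
    return rtt_us + transfer_us + compute_us

def nmc_latency_us(result_bytes, link_gbps=50.0, rtt_us=1.0, compute_us=4.0):
    """Near-memory core executes the step: slower compute (assumed 2x),
    but only the small result crosses the interconnect."""
    transfer_us = result_bytes * 8 / (link_gbps * 1e3)
    return rtt_us + compute_us + transfer_us

if __name__ == "__main__":
    operand = 4 * 1024 * 1024   # 4 MiB of rows gathered from remote memory
    result = 16 * 1024          # 16 KiB reduced result returned to the host
    print(f"host-side execution: {host_latency_us(operand):8.1f} us")
    print(f"NMC offload:         {nmc_latency_us(result):8.1f} us")
```

Under these assumed parameters the host-side path is dominated by operand transfer (roughly 670 us for 4 MiB at 50 Gbps), while the offloaded path finishes in under 10 us, which is the kind of execution-latency reduction the presentation's simulation data is meant to demonstrate.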
Syllabus
Near Memory Compute for AI Inferencing
Taught by
Open Compute Project