
YouTube

Offloading-Efficient Sparse AI Systems

ACM SIGPLAN via YouTube

Overview

Explore offloading-efficient sparse AI systems in this 19-minute conference talk by Luo Mai, presented at the SPARSE 2024 workshop. Learn about offloading strategies designed to improve the efficiency, performance, and resource utilization of sparse AI models, and consider what this research implies for building more efficient and scalable AI systems. The SPARSE workshop, sponsored by ACM SIGPLAN as part of the PLDI 2024 conference, focuses on innovations in sparse computing for AI applications.

Syllabus

[SPARSE24] Offloading-Efficient Sparse AI Systems

Taught by

ACM SIGPLAN

