SAMPL Research Group Presentations on Machine Learning Hardware and Systems
Paul G. Allen School via YouTube

Overview
Explore cutting-edge research in machine learning hardware, distributed training systems, and efficient computation techniques through a series of presentations by graduate students from the SAMPL Research Group at the University of Washington's Paul G. Allen School of Computer Science & Engineering. Topics include automatic synthesis of real-time machine learning hardware, optimized communication for cloud-based distributed training, dynamic tensor rematerialization for memory efficiency, alternative datatypes for representing real numbers, and automatic generation of quantized machine learning kernels. Gain insight into the latest advances aimed at improving the performance, efficiency, and deployability of machine learning models across diverse computing environments.

Syllabus
UW Allen School Colloquium: SAMPL Research Group

Taught by
Paul G. Allen School