Explore cutting-edge research in machine learning hardware, distributed training systems, and efficient computation techniques through a series of presentations by graduate students from the SAMPL Research Group at the University of Washington's Paul G. Allen School of Computer Science & Engineering. Dive into topics such as automatic synthesis of real-time machine learning hardware, optimized communication for cloud-based distributed training, dynamic tensor rematerialization for memory efficiency, alternative datatypes for representing real numbers, and automatic generation of quantized machine learning kernels. Gain insights into the latest advancements aimed at improving the performance, efficiency, and deployability of machine learning models across various computing environments.