Overview
Explore a groundbreaking approach to optimizing tensor programs in this 15-minute conference talk from OSDI '21. Dive into PET (Partially Equivalent Transformations), a tensor program optimizer for DNNs that expands the optimization space by applying transformations that preserve only partial functional equivalence. Learn how PET automatically corrects the results to restore full equivalence, unlocking optimization opportunities that fully equivalent systems miss. Discover the rigorous theoretical foundations that simplify equivalence examination and correction, and understand the efficient search algorithm that combines fully and partially equivalent optimizations at multiple levels. Gain insights into PET's performance compared to existing systems, with speedups of up to 2.5x. Examine key challenges, the mutant generator concept, multi-linear tensor programs, and the program optimizer in this cutting-edge presentation on enhancing deep neural network efficiency.
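To make the idea of a partially equivalent transformation plus correction concrete, here is a minimal NumPy sketch loosely modeled on the paper's motivating example (folding a convolution's batch dimension into its height dimension). It is an illustration only, not code from the talk or the PET system; all function names are hypothetical, and the "correction" here simply recomputes the few boundary rows where equivalence is violated, whereas PET generates a targeted correction kernel.

```python
import numpy as np

def conv2d_same(x, k):
    """'Same'-padded single-channel 2D convolution (DNN-style cross-correlation).
    x: (H, W), k: (kh, kw) with odd kh, kw -> output (H, W)."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def batched_conv_reference(xs, k):
    """Fully equivalent baseline: convolve each image in the batch separately."""
    return np.stack([conv2d_same(x, k) for x in xs])

def batched_conv_pet_style(xs, k):
    """Illustrative partially equivalent transformation:
    fold the batch dimension into the height dimension, run ONE large
    convolution, then correct the output rows that are wrong.

    Output rows near the seams between stacked images read pixels from the
    neighboring image instead of zero padding, so only those rows violate
    equivalence; every interior row already matches the baseline."""
    n, H, W = xs.shape
    kh, kw = k.shape
    ph = kh // 2

    # Transformed program: a single convolution over the stacked (n*H, W) tensor.
    stacked = xs.reshape(n * H, W)
    out = conv2d_same(stacked, k).reshape(n, H, W)

    # Correction step: recompute the ph rows at the top and bottom of each image.
    # (For brevity this reuses the baseline; a real optimizer would emit a small
    # kernel touching only these positions.)
    ref = batched_conv_reference(xs, k)
    out[:, :ph, :] = ref[:, :ph, :]
    out[:, H - ph:, :] = ref[:, H - ph:, :]
    return out

# Quick check that the correction restores full equivalence.
rng = np.random.default_rng(0)
xs = rng.standard_normal((4, 8, 8))
k = rng.standard_normal((3, 3))
assert np.allclose(batched_conv_pet_style(xs, k), batched_conv_reference(xs, k))
```

The transformed program trades a batch of small convolutions for one larger convolution plus a cheap, localized correction, which is the kind of opportunity a purely fully-equivalent optimizer cannot consider.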
Syllabus
Intro
Tensor Program Transformations
Current Systems Consider only Fully Equivalent Transformations
Motivating Example
PET Overview
Key Challenges
Mutant Generator
Challenges: Examine Transformations
A Strawman Approach
Multi-Linear Tensor Program (MLTP)
Mutant Corrector
Program Optimizer
More Evaluation in Paper
Taught by
USENIX