Gradients for Everyone: A Quick Guide to Autodiff in Julia
The Julia Programming Language via YouTube
Overview
Explore the world of automatic differentiation (AD) in Julia through this informative conference talk. Dive into the core concepts behind taking gradients of arbitrary computer programs, a capability central to recent breakthroughs in scientific computing and machine learning. Compare Julia's approach to AD with the fragmented landscape of frameworks in Python, and discover the vision of making the entire Julia language differentiable. Learn about the main AD packages in Julia, including ForwardDiff, ReverseDiff, Zygote, and Enzyme, and understand their distinct tradeoffs. Gain insights from both the package-developer and user perspectives, covering topics such as the classification of AD systems, forward and reverse modes, making code differentiable, and using differentiable code effectively. Acquire the knowledge needed to make informed decisions about AD in your Julia projects.
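As a flavor of what the talk covers, here is a minimal sketch (not taken from the talk itself) of computing the same gradient with two of the packages mentioned: ForwardDiff (forward mode) and Zygote (reverse mode). The function `f` is an illustrative assumption chosen for its simple analytical gradient.

```julia
# Hypothetical example: differentiate f(x) = ||x||^2, whose gradient is 2x,
# using two of the AD packages discussed in the talk.
using ForwardDiff, Zygote

f(x) = sum(abs2, x)   # scalar-valued function of a vector

x = [1.0, 2.0, 3.0]

g_forward = ForwardDiff.gradient(f, x)   # forward-mode AD
g_reverse, = Zygote.gradient(f, x)       # reverse-mode AD (returns a tuple)

g_forward == g_reverse == 2 .* x
```

Forward mode propagates derivatives alongside the computation and scales with the number of inputs, while reverse mode records the computation and sweeps backward, scaling with the number of outputs; for a scalar loss over many parameters, reverse mode is typically the better fit, which is one of the tradeoffs the talk discusses.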
Syllabus
Gradients for everyone: a quick guide to autodiff in Julia | Dalle, Hill | JuliaCon 2024
Taught by
The Julia Programming Language