Overview
Explore a research talk in which Google DeepMind scientist Yasaman Bahri examines the fundamental principles of deep learning through the lens of statistical physics. Discover the correspondence between large-width neural networks, Gaussian processes, and kernels (illustrated in the first sketch below), with applications to research in the physical sciences. Learn about scaling laws, the empirical power laws describing how machine learning models improve with more training data, larger model size, and greater compute (see the second sketch below). Gain insights from the intersection of statistical physics, machine learning, and condensed matter physics as Bahri, who holds a Physics Ph.D. from UC Berkeley and has lectured at the Les Houches School of Physics, shares her research findings. Understand how physics-based approaches and methodology can help uncover the underlying principles of neural networks, moving beyond their treatment as black boxes. The presentation combines theoretical insight with practical application, showing how physics-inspired analysis advances our understanding of modern AI and machine learning systems.
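The wide-network/Gaussian-process correspondence mentioned above can be demonstrated numerically. The following is a minimal sketch, not code from the talk: it samples many randomly initialized one-hidden-layer ReLU networks and shows that, as the hidden width grows, the distribution of the output at a fixed input becomes Gaussian (excess kurtosis tends to zero) while its variance stabilizes. The function name `sample_outputs` and the parameters `sigma_w` and `sigma_v` are illustrative choices, not names from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_outputs(x, width, n_nets, sigma_w=1.0, sigma_v=1.0):
    """Draw outputs f(x) of n_nets random one-hidden-layer ReLU networks.

    f(x) = (sigma_v / sqrt(width)) * v . relu(W x), with W and v Gaussian,
    scaled so that Var[f(x)] stays O(1) as the width grows.
    """
    d = x.shape[0]
    W = rng.normal(0.0, sigma_w / np.sqrt(d), size=(n_nets, width, d))
    v = rng.normal(0.0, 1.0, size=(n_nets, width))
    h = np.maximum(W @ x, 0.0)                      # hidden activations, (n_nets, width)
    return sigma_v * np.einsum("nw,nw->n", v, h) / np.sqrt(width)

x = np.ones(4) / 2.0
for width in (2, 16, 1024):
    f = sample_outputs(x, width, n_nets=20000)
    # Excess kurtosis of a Gaussian is 0; it shrinks as the width increases.
    kurt = np.mean((f - f.mean()) ** 4) / f.var() ** 2 - 3.0
    print(f"width={width:5d}  var={f.var():.3f}  excess kurtosis={kurt:+.3f}")
```

In the infinite-width limit the output is exactly a Gaussian process whose covariance is the so-called NNGP kernel; the sketch only checks Gaussianity at a single input, which is the simplest observable consequence of that limit.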
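Scaling laws of the kind mentioned in the overview are typically fit as power laws, which appear as straight lines on log-log axes. Below is a minimal, hypothetical sketch of that fitting procedure using synthetic loss measurements; the numbers are invented for illustration and are not results from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "test loss vs. dataset size" points following L(N) = a * N**(-alpha),
# with a little multiplicative noise to mimic measurement scatter.
N = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
L = 4.2 * N ** -0.35 * np.exp(rng.normal(0.0, 0.02, size=N.shape))

# A power law is linear in log-log space, so ordinary least squares on
# (log N, log L) recovers the exponent alpha and the prefactor a.
slope, intercept = np.polyfit(np.log(N), np.log(L), deg=1)
print(f"fitted alpha ≈ {-slope:.3f}, prefactor a ≈ {np.exp(intercept):.2f}")
```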
Syllabus
DDPS | “A first-principles approach to understanding deep learning”
Taught by
Inside Livermore Lab