Neural Entropy - Understanding Deep Learning Through Information Theory and Diffusion Models
Valence Labs via YouTube
Overview
Explore a comprehensive lecture that examines the intersection of deep learning and information theory through the lens of diffusion models. Learn how principles from non-equilibrium thermodynamics quantify the information needed to reverse a diffusive process, and discover how a neural network acts like Maxwell's demon during generation. Examine the entropy matching model, which demonstrates a precise correspondence between the information a network absorbs during training and the entropy it must reverse during generation. Understand how entropy analysis reveals how efficiently networks encode and store information, synthesizing concepts from stochastic optimal control, thermodynamics, information theory, and optimal transport. Gain insight into using diffusion models as experimental platforms for probing neural network behavior, as presented in the Neural Entropy paper.
Syllabus
Neural Entropy | Akhil Premkumar
Taught by
Valence Labs