Topology of Surprisal - Information Theory and Vietoris-Rips Filtrations
Applied Algebraic Topology Network via YouTube

Overview
Explore the intersection of information theory and topological data analysis in this insightful lecture on combining Kullback-Leibler divergence with Vietoris-Rips filtrations. Delve into the challenges of using the non-symmetric Kullback-Leibler divergence as a distance measure in topological constructions and discover an innovative approach to overcome this limitation. Learn about the concept of surprisal in probability distributions and its applications in deep learning. Gain valuable insights from the work of Hubert Wagner, Herbert Edelsbrunner, and Ziga Virk as they present a novel method for integrating information-theoretical distance measures with topological data analysis techniques.
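As a quick, self-contained illustration of the asymmetry the lecture addresses, here is a minimal Python sketch that computes surprisal values and the Kullback-Leibler divergence for two small discrete distributions. The distributions and function names are illustrative assumptions, not material from the talk.

```python
import numpy as np

def surprisal(p):
    """Surprisal (self-information) of each outcome: -log p(x)."""
    p = np.asarray(p, dtype=float)
    return -np.log(p)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for strictly positive discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

# Two toy distributions over three outcomes (illustrative only).
P = [0.7, 0.2, 0.1]
Q = [0.4, 0.4, 0.2]

print(surprisal(P))           # per-outcome surprisal under P
print(kl_divergence(P, Q))    # D(P || Q)
print(kl_divergence(Q, P))    # D(Q || P) -- generally different: KL is not symmetric
```

Because D(P || Q) generally differs from D(Q || P), the Kullback-Leibler divergence is not a metric and cannot be plugged directly into a Vietoris-Rips construction, which is precisely the limitation the lecture discusses.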
Syllabus
Hubert Wagner (4/14/23): Topology of... surprisal: Information theory and Vietoris-Rips filtrations
Taught by
Applied Algebraic Topology Network