Explore the intersection of information theory and topological data analysis in this lecture on combining the Kullback-Leibler divergence with Vietoris-Rips filtrations. Examine why the KL divergence's lack of symmetry makes it problematic as a distance measure in topological constructions, and see an approach for overcoming this limitation. Learn how surprisal quantifies the information content of an outcome of a probability distribution, and why this notion matters in applications such as deep learning. The lecture presents joint work by Hubert Wagner, Herbert Edelsbrunner, and Ziga Virk on integrating information-theoretic distance measures with topological data analysis techniques.
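To make the ingredients concrete, here is a minimal Python sketch of the two pieces the lecture brings together: the (non-symmetric) KL divergence between discrete distributions, expressed via surprisal, and a Vietoris-Rips-style filtration that assigns each simplex the largest pairwise dissimilarity among its vertices. The max-symmetrization used below is one simple convention for handling the asymmetry, not necessarily the construction the authors propose, and the helper names (kl_divergence, rips_filtration_values) are illustrative, not from the lecture.

```python
import numpy as np
from itertools import combinations

def surprisal(p):
    """Surprisal (self-information) of each outcome: -log p(x)."""
    return -np.log(p)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q): the expected extra surprisal
    incurred by coding outcomes drawn from p as if they came from q.
    Non-symmetric: D(p || q) != D(q || p) in general."""
    return np.sum(p * (surprisal(q) - surprisal(p)))

def rips_filtration_values(points, dissimilarity, max_dim=2):
    """Assign each simplex a Vietoris-Rips filtration value: the largest
    pairwise dissimilarity among its vertices. Since KL is non-symmetric,
    each pair is symmetrized here with max(d(p,q), d(q,p)) -- a simple
    illustrative convention, not necessarily the lecture's approach."""
    n = len(points)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                d[i, j] = dissimilarity(points[i], points[j])
    filtration = {}
    for dim in range(1, max_dim + 1):
        for simplex in combinations(range(n), dim + 1):
            filtration[simplex] = max(
                max(d[i, j], d[j, i]) for i, j in combinations(simplex, 2)
            )
    return filtration

# Three discrete distributions over the same three outcomes (no zero entries,
# so all divergences are finite).
dists = [np.array([0.7, 0.2, 0.1]),
         np.array([0.1, 0.8, 0.1]),
         np.array([0.3, 0.3, 0.4])]

for simplex, value in sorted(rips_filtration_values(dists, kl_divergence).items(),
                             key=lambda kv: kv[1]):
    print(f"simplex {simplex} enters the filtration at t = {value:.4f}")
```

Running the sketch prints the order in which edges and the triangle appear as the filtration parameter grows, which is exactly the information a persistence computation would consume downstream.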