Explore a 49-minute lecture on "Singular Learning, Relative Information and the Dual Numbers" presented by Shaowei Lin from the Topos Institute at IPAM's Theory and Practice of Deep Learning Workshop. Delve into the fundamental concept of relative information (Kullback-Leibler divergence) in statistics, machine learning, and information theory. Discover the definition and axiomatic properties of conditional relative information and its applications in machine learning, including Sumio Watanabe's Singular Learning Theory. Examine the rig category Info of random variables and conditional maps, as well as the rig category R(ε) of dual numbers. Learn how relative information can be constructed through rig monoidal functors from Info to R(ε). Gain insights into potential connections with information cohomology and operad derivations.
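
For orientation, the two objects at the heart of the talk have standard definitions (sketched here in the usual conventions; the lecture's categorical packaging of them is its own contribution and is not reproduced here):

    D(p \,\|\, q) \;=\; \sum_{x} p(x) \log \frac{p(x)}{q(x)}

    (a + b\varepsilon)(c + d\varepsilon) \;=\; ac + (ad + bc)\,\varepsilon, \qquad \varepsilon^{2} = 0

The first is the relative information of p from q; the second is multiplication in the dual numbers R(ε), whose ε-coefficient behaves like a first-order derivative.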
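
To make the dual-number side concrete, here is a minimal Python sketch (an illustrative Dual class of our own, not code from the lecture) of the arithmetic that a functor into R(ε) would land in; the ε² = 0 rule is also the mechanism behind forward-mode automatic differentiation:

    class Dual:
        """Dual number a + b*eps, with eps**2 == 0."""
        def __init__(self, a, b=0.0):
            self.a = a  # standard part
            self.b = b  # epsilon coefficient
        def __add__(self, other):
            return Dual(self.a + other.a, self.b + other.b)
        def __mul__(self, other):
            # (a + b eps)(c + d eps) = ac + (ad + bc) eps, since eps**2 = 0
            return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
        def __repr__(self):
            return f"{self.a} + {self.b}ε"

    # The epsilon coefficient of f(x + ε) recovers f'(x):
    x = Dual(3.0, 1.0)   # 3 + ε
    print(x * x + x)     # 12.0 + 7.0ε, matching f(x) = x² + x, f'(3) = 7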