Probabilistic Inference Using Contraction of Tensor Networks
The Julia Programming Language via YouTube
Overview
Explore probabilistic inference using tensor network contraction in this 29-minute conference talk from JuliaCon 2024. Learn how TensorInference.jl, a Julia package, combines probabilistic graphical models (PGMs) with tensor networks to improve performance on complex inference tasks. Discover the challenges of exact and approximate inference methods, and see how tensor networks provide a powerful representation of complex system states. Gain insights into optimizing contraction sequences, leveraging differentiable programming, and using advanced contraction-order methods such as TreeSA, SABipartite, KaHyParBipartite, and GreedyMethod. Learn about the package's support for generic element types, hyper-optimized contraction-order settings, and integration with BLAS routines and GPUs for improved efficiency. Explore applications in AI, medical diagnosis, computer vision, and natural language processing, and understand the potential of exact methods in probabilistic inference.
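As a companion to the talk, the sketch below shows how an inference workflow along these lines might look with TensorInference.jl: load a PGM, compile it into a tensor network with a chosen contraction-order optimizer (TreeSA here), and query it. It assumes the package's documented API (read_model_file, TensorNetworkModel, probability, marginals, most_probable_config); the bundled example model path and the evidence variable index are illustrative assumptions, so adapt them to your own model and consult the package documentation for exact signatures.

```julia
# Minimal sketch of probabilistic inference with TensorInference.jl.
# Assumes the documented API; the example file path and evidence index
# are placeholders, not guaranteed to match your installation.
using TensorInference

# Load a PGM in UAI format (here, the package's bundled "asia network" example).
model = read_model_file(pkgdir(TensorInference, "examples", "asia-network", "model.uai"))

# Compile the PGM into a tensor network. The optimizer searches for a
# hyper-optimized contraction order; TreeSA is one of the supported methods,
# alongside GreedyMethod, SABipartite, and KaHyParBipartite.
tn = TensorNetworkModel(model; optimizer = TreeSA(), evidence = Dict(1 => 0))

probability(tn)           # probability of the observed evidence
marginals(tn)             # marginal distributions of the remaining variables
most_probable_config(tn)  # most probable configuration (MAP/MPE)
```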
Syllabus
Probabilistic inference using contraction of tensor networks | Roa-Villescas | JuliaCon 2024
Taught by
The Julia Programming Language