Overview
Explore a cutting-edge approach to graph neural networks in this 54-minute lecture on topological relational learning. Delve into topological relational inference (TRI), a novel topological neural framework designed to overcome the limitations of traditional GNNs. Learn how the method integrates higher-order graph information and systematically learns local graph structures by rewiring the original graph using persistent homology. Discover the theoretical stability guarantees for this new local topological representation and its implications for graph algebraic connectivity. Examine experimental results showing that TRI-GNN outperforms 14 state-of-the-art baselines on node classification across 6 of 7 graphs while exhibiting enhanced robustness to perturbations. Gain insights into topics such as infinite persistence, topological similarity, and recursive future programming schemes. Conclude with a discussion of the framework's applications to stem cells, citation networks, and boundary sensitivity.
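The core mechanism described above, rewiring a graph based on persistent homology, can be illustrated with a minimal sketch. This is not the lecture's or the TRI paper's implementation; all function names and thresholds here are illustrative assumptions. It computes the 0-dimensional persistence of an edge-weight filtration with a union-find sweep (components are born at value 0 and die when they merge), then compares two such diagrams with a crude padded 1-Wasserstein-style distance of the kind a rewiring rule could threshold on:

```python
def zero_dim_persistence(n_nodes, weighted_edges):
    """0-dimensional persistence of an edge-weight filtration.

    Every vertex is born at filtration value 0; a connected component
    dies when it merges into another during a Kruskal-style sweep over
    edges in increasing weight order. Returns sorted finite death times
    (one per merge); the surviving component has infinite persistence.
    """
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    deaths = []
    for w, u, v in sorted(weighted_edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            deaths.append(w)  # one component dies at weight w
    return sorted(deaths)


def diagram_distance(d1, d2, inf_value=10.0):
    """Crude 1-Wasserstein-style distance between two lists of death
    times, padding the shorter list with a large surrogate for points
    of infinite persistence (a simplification, not the paper's metric)."""
    a, b = sorted(d1), sorted(d2)
    m = max(len(a), len(b))
    a += [inf_value] * (m - len(a))
    b += [inf_value] * (m - len(b))
    return sum(abs(x - y) for x, y in zip(a, b))


# Toy example: a 4-cycle with weighted edges. The heaviest edge closes
# a loop, so it causes no merge and yields no finite death time.
edges = [(0.5, 0, 1), (0.7, 1, 2), (0.9, 2, 3), (1.2, 0, 3)]
deaths = zero_dim_persistence(4, edges)
print(deaths)  # [0.5, 0.7, 0.9]

# A rewiring rule might add an edge between two nodes whose local
# diagrams are close; 'threshold' is an illustrative hyperparameter.
threshold = 1.0
close_enough = diagram_distance([0.5, 0.7], [0.6, 0.8]) < threshold
print(close_enough)  # True
```

In practice, diagrams would be computed per node over a local neighborhood subgraph (and with a library such as GUDHI or Ripser for higher homology dimensions), but the filtration-then-compare pattern is the same.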
Syllabus
Introduction
Overview
Contributions
Topological Induced Molecular Representation
Infinite Persistence
Topological Similarity
Topological Induced Multiple Fragmentation
Recursive Future Programming Scheme
Experiment
Stem Framework
Changing Graph Computer
Computational Capacity
Citation Networks
Boundary Sensitivity
Summary
Question
Taught by
Applied Algebraic Topology Network