Combinatorial Invariance: A Case Study of Pure Math / Machine Learning Interaction - Geordie Williamson
Institute for Advanced Study via YouTube
Overview
Explore a fascinating lecture at the intersection of pure mathematics and machine learning, focusing on the combinatorial invariance conjecture in representation theory. Delve into the collaboration between Geordie Williamson and DeepMind to apply modern machine learning techniques to problems in pure mathematics. Discover how neural networks, convolutional nets, and other ML models were used to shed light on Kazhdan-Lusztig polynomials and their relationship to directed graphs. Learn about the challenges of extracting new mathematical insight from these models and the resulting formula that offers a fresh perspective on the combinatorial invariance conjecture. Gain insights into topics such as perceptrons, neural nets, geometries, generalization, training, Bruhat graphs, and analytic polynomials. Visualize combinatorial invariance, predict Kazhdan-Lusztig polynomials, and understand the role of saliency in this setting.
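To make the general workflow concrete, here is a minimal, hypothetical sketch of the kind of setup the lecture describes: a small neural network mapping a feature vector derived from a Bruhat interval's directed graph to predicted Kazhdan-Lusztig polynomial coefficients, with gradient-based saliency used to see which input features the model relies on. This is not the actual model or featurisation used in the collaboration; the dimensions, architecture, and function names below are illustrative assumptions.

```python
# Hypothetical sketch only: a feed-forward regressor from assumed graph
# features to assumed KL-polynomial coefficients, plus input saliency.
import torch
import torch.nn as nn

IN_DIM = 64    # assumed size of a featurised Bruhat graph
OUT_DIM = 6    # assumed number of predicted KL coefficients

model = nn.Sequential(
    nn.Linear(IN_DIM, 128),
    nn.ReLU(),
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, OUT_DIM),
)

def saliency(features: torch.Tensor) -> torch.Tensor:
    """Gradient of the summed prediction w.r.t. the input features.
    Larger absolute values mark features the model depends on most."""
    x = features.clone().requires_grad_(True)
    model(x).sum().backward()
    return x.grad.abs()

# Random stand-in for a featurised Bruhat graph.
fake_features = torch.randn(IN_DIM)
print(saliency(fake_features))
```

In the lecture, saliency analysis of this flavour is what pointed toward the graph structures behind the eventual formula, rather than the raw predictions themselves.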
Syllabus
Introduction
Motivation
Perceptron
Neural nets
Convolutional nets
Convolutional neural nets
Geometries
Generalizations
Training
Machine learning for mathematicians
Bruhat graph
Analytic polynomials
Examples
Visualizing combinatorial invariance
Predicting KL polynomials
Saliency
Taught by
Institute for Advanced Study