Are Neural Networks Optimal Approximation Algorithms for Constraint Satisfaction Problems?
USC Probability and Statistics Seminar via YouTube
Overview
Explore the capabilities of neural networks in solving NP-hard optimization problems, particularly constraint satisfaction problems, in this 40-minute talk from the USC Probability and Statistics Seminar. Delve into the OptGNN graph neural network architecture and its ability to capture optimal approximation algorithms for constraint satisfaction problems. Discover how OptGNN operates as a convex program solver, producing bounds on the optimality of combinatorial solutions. Examine the competitive performance of OptGNN against state-of-the-art unsupervised neural baselines on neural combinatorial optimization benchmarks. Gain insights into the connections between neural networks and computation, and consider potential avenues for future research in this field. A toy sketch of the kind of relaxation-based approach discussed in the talk appears below.
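To make the "convex program solver" idea concrete, here is a minimal, illustrative sketch of the classical pipeline that motivates this line of work: low-rank, message-passing-style updates for the Goemans-Williamson SDP relaxation of Max-Cut, followed by random-hyperplane rounding. This is not the OptGNN architecture itself; the function names, the simple gradient-style update, and the hyperparameters are hypothetical choices made for illustration only.

```python
import numpy as np

def maxcut_message_passing(adj, dim=8, iters=200, lr=0.1, seed=0):
    """Low-rank (Burer-Monteiro style) updates for the Max-Cut SDP relaxation.

    Each vertex keeps a unit vector; each iteration pushes neighboring vectors
    apart, mimicking projected gradient ascent on the relaxation objective
    sum over edges (i, j) of (1 - <v_i, v_j>) / 2.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    V = rng.normal(size=(n, dim))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(iters):
        V = V - lr * (adj @ V)                          # repel neighbors (gradient step, up to constants)
        V /= np.linalg.norm(V, axis=1, keepdims=True)   # project back onto the unit sphere
    return V

def hyperplane_round(V, seed=0):
    """Goemans-Williamson style rounding: sign of projection onto a random hyperplane."""
    rng = np.random.default_rng(seed)
    r = rng.normal(size=V.shape[1])
    return np.sign(V @ r)

def cut_value(adj, x):
    """Number of edges cut by the +/-1 assignment x (adj counts each edge twice)."""
    return np.sum(adj * (1 - np.outer(x, x))) / 4

# Toy example: a 5-cycle.
n = 5
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

V = maxcut_message_passing(adj)
x = hyperplane_round(V)
# Relaxation value of the embedding; at the SDP optimum this upper-bounds the max cut.
relaxation_value = np.sum(adj * (1 - V @ V.T)) / 4
print(f"rounded cut = {cut_value(adj, x):.0f}, relaxation value = {relaxation_value:.2f}")
```

The talk's premise, roughly, is that a graph neural network can learn message-passing updates of this flavor, so that its node embeddings approximately solve the convex relaxation and can also be read off to certify bounds on the optimum.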
Syllabus
Morris Yau: Are Neural Networks Optimal Approximation Algorithms (MIT)
Taught by
USC Probability and Statistics Seminar