Overview
Explore a video that examines the application of neural networks to complex symbolic mathematics problems such as integration and ordinary differential equations. Learn about a syntax for representing mathematical expressions as sequences and about methods for generating large datasets to train sequence-to-sequence models. Discover how this approach outperforms commercial computer algebra systems such as MATLAB and Mathematica on the tested problems. Examine the paper by Guillaume Lample and François Charton, which challenges the notion that neural networks are limited to statistical or approximate problems. Gain insight into the use of Reverse Polish Notation and into how the model works. Follow along as the video breaks down the integration procedure and discusses important caveats of this approach to symbolic mathematics.
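As a minimal sketch of the sequence representation the video discusses, the Python below serializes a small expression tree into Reverse Polish Notation tokens, the kind of flat sequence a sequence-to-sequence model can consume. The Node class and to_rpn function are hypothetical illustrations, not the authors' actual code.

```python
# Hypothetical sketch (not the paper's code): turning an expression tree
# into a flat token sequence suitable as seq2seq model input.

class Node:
    """A node in a simple expression tree: an operator or an operand."""
    def __init__(self, value, children=()):
        self.value = value              # e.g. '*', '+', 'x', '2'
        self.children = list(children)  # empty for leaves (operands)

def to_rpn(node):
    """Post-order traversal yields Reverse Polish Notation tokens."""
    tokens = []
    for child in node.children:
        tokens.extend(to_rpn(child))
    tokens.append(node.value)
    return tokens

# Example: x * (x + 2)  ->  ['x', 'x', '2', '+', '*']
expr = Node('*', [Node('x'), Node('+', [Node('x'), Node('2')])])
print(to_rpn(expr))
```

Because the traversal order fully determines the expression, no parentheses are needed, which keeps the token vocabulary small and the sequences unambiguous.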
Syllabus
Intro
Paper
How they did it
Reverse Polish Notation
How it works
Integration
Caveat
Taught by
Yannic Kilcher