Constructing Physics-Consistent Neural Networks Using Probability Theory
Alan Turing Institute via YouTube
Overview
Explore a novel approach to constructing neural networks that inherently obey physical laws in this comprehensive lecture from the Alan Turing Institute. Delve into the connection between probability, physics, and neural networks as speaker Sascha Ranftl begins with a simple single-layer neural network and applies the central limit theorem in the infinite-width limit. Learn how the network output becomes Gaussian under certain conditions, allowing Gaussian process theory to be applied. Discover how linear operators, including the differential operators that define physical laws, act on Gaussian processes to yield new Gaussian processes. Examine the concept of physics-consistency for Gaussian processes and its implications for infinite neural networks. Understand how to construct neural networks that obey physical laws by choosing activation functions that match particular kernels in the infinite-width limit. Analyze simple examples based on the homogeneous 1D Helmholtz equation and compare them to naive kernels and activations in this 71-minute presentation.
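As a concrete illustration of the central idea (a minimal sketch, not code from the talk), the snippet below builds a single-hidden-layer network with cosine activations and random phases. By the central limit theorem, in the infinite-width limit its output is a Gaussian process with kernel K(x, x') = ½ cos(k(x − x')), and every finite-width draw satisfies the homogeneous 1D Helmholtz equation u'' + k²u = 0 exactly, making the network physics-consistent by construction. The wavenumber k, the width N, and the helper name sample_network are illustrative choices, not taken from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 2.0      # Helmholtz wavenumber (illustrative choice)
N = 10_000   # hidden-layer width; large so the CLT regime applies

def sample_network(x):
    """Evaluate one random wide network f(x) = sum_i w_i cos(k*x + b_i)."""
    phases = rng.uniform(0.0, 2.0 * np.pi, size=N)   # random biases b_i
    weights = rng.standard_normal(N) / np.sqrt(N)    # 1/sqrt(N) CLT scaling
    return np.cos(k * x[:, None] + phases[None, :]) @ weights

x = np.linspace(0.0, 5.0, 400)
f = sample_network(x)

# Every draw obeys u'' + k^2 u = 0 exactly, since each cosine feature does;
# the finite-difference residual below is zero up to discretisation error.
u_xx = np.gradient(np.gradient(f, x), x)
print("max Helmholtz residual:", np.max(np.abs(u_xx + k**2 * f)[2:-2]))

# Monte Carlo check of the infinite-width kernel:
# E[f(x) f(x')] -> 0.5 * cos(k * (x - x')) as N -> infinity.
draws = np.stack([sample_network(x) for _ in range(500)])
empirical = (draws[:, 0] * draws[:, 100]).mean()
print("empirical kernel:", empirical,
      " theory:", 0.5 * np.cos(k * (x[0] - x[100])))
```

A naive choice, such as tanh activations or a generic RBF kernel, would not satisfy the Helmholtz equation, which mirrors the comparison to naive kernels and activations made in the lecture.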
Syllabus
Sascha Ranftl - A connection between probability, physics and neural networks
Taught by
Alan Turing Institute