
Constructing Physics-Consistent Neural Networks Using Probability Theory

Alan Turing Institute via YouTube

Overview

Explore a novel approach to constructing neural networks that inherently obey physical laws in this comprehensive lecture from the Alan Turing Institute. Delve into the connection between probability, physics, and neural networks as the speaker starts from a simple single-layer network and applies the central limit theorem in the infinite-width limit, under which the network output becomes Gaussian and Gaussian process theory applies. Discover how linear operators, including the differential operators that define physical laws, map Gaussian processes to new Gaussian processes. Examine the notion of physics-consistency for Gaussian processes and its implications for infinite neural networks, and learn how to construct networks that obey physical laws by choosing activation functions that match particular kernels in the infinite-width limit. Analyze a simple example, the homogeneous 1D Helmholtz equation, and compare the resulting kernels and activations to naive choices in this insightful 71-minute presentation.
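The infinite-width argument summarized above can be illustrated numerically. The sketch below is not taken from the lecture; all names, the `tanh` activation, and the specific widths and sample counts are illustrative assumptions. It draws many random single-hidden-layer networks with i.i.d. weights and the standard 1/√H output scaling, then checks that the distribution of outputs at a fixed input looks Gaussian, as the central limit theorem predicts.

```python
import numpy as np

# Illustrative sketch (not from the lecture): the central-limit-theorem
# argument for a single-hidden-layer network in the infinite-width limit.
# f(x) = (1/sqrt(H)) * sum_j v_j * tanh(w_j * x + b_j) with i.i.d. standard
# normal weights; across random weight draws, f(x) approaches a Gaussian
# as the hidden width H grows.

def random_network_output(x, width, rng):
    """One draw of a random single-hidden-layer network evaluated at x."""
    w = rng.standard_normal(width)   # input-to-hidden weights
    b = rng.standard_normal(width)   # hidden biases
    v = rng.standard_normal(width)   # hidden-to-output weights
    h = np.tanh(w * x + b)           # hidden-layer activations
    return v @ h / np.sqrt(width)    # 1/sqrt(H) scaling keeps variance finite

rng = np.random.default_rng(0)
samples = np.array([random_network_output(0.5, 2000, rng)
                    for _ in range(5000)])

# For a zero-mean Gaussian, the mean is 0 and the kurtosis E[f^4]/Var[f]^2 is 3.
mean = samples.mean()
kurtosis = np.mean(samples**4) / np.var(samples)**2
print(f"mean ≈ {mean:.3f} (should be near 0)")
print(f"kurtosis ≈ {kurtosis:.2f} (should be near 3)")
```

Gaussianity of the output is what licenses the Gaussian process view in the lecture: once the prior over functions is a GP, applying a linear (e.g. differential) operator yields another GP with a transformed kernel.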

Syllabus

Sascha Ranftl - A connection between probability, physics and neural networks

Taught by

Alan Turing Institute

