Statistical Learning Theory and Neural Networks

Simons Institute via YouTube

Overview

Explore fundamental concepts in statistical learning theory and their application to deep neural networks in this comprehensive tutorial. Delve into uniform laws of large numbers and their relationship to function class complexity. Focus on Rademacher complexity as a key complexity measure, examining upper bounds for deep ReLU networks. Investigate the apparent contradictions between modern neural network behavior and classical statistical intuitions. Gain insights into neural network training from an optimization perspective, reviewing gradient descent analysis for convex and smooth objectives. Understand the Polyak–Łojasiewicz (PL) inequality and its relevance to neural network training. Examine the neural tangent kernel (NTK) regime and how it approximates neural network training. Learn two approaches to establishing PL inequalities for neural networks: a general method based on NTK approximation and a specific technique for linearly separable data.
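To make the optimization side of the tutorial concrete, here is a minimal, hedged sketch (not from the tutorial itself) of the key phenomenon behind the PL inequality: for an objective satisfying the PL condition, gradient descent with a 1/L step size converges linearly to the optimal value even without strong convexity. The least-squares objective used below is an illustrative choice; all names and constants are this sketch's own.

```python
import numpy as np

# Illustrative objective: f(w) = 0.5 * ||X w - y||^2.
# This satisfies the Polyak-Lojasiewicz (PL) inequality
#   0.5 * ||grad f(w)||^2 >= mu * (f(w) - f*),
# with mu the smallest nonzero eigenvalue of X^T X, so gradient
# descent enjoys a linear convergence rate (1 - mu/L) per step.

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
y = rng.standard_normal(20)

def loss(w):
    r = X @ w - y
    return 0.5 * r @ r

def grad(w):
    return X.T @ (X @ w - y)

# Smoothness constant L = largest eigenvalue of X^T X; step size 1/L.
L = np.linalg.eigvalsh(X.T @ X).max()

w = np.zeros(5)
losses = [loss(w)]
for _ in range(200):
    w -= grad(w) / L
    losses.append(loss(w))

# Optimal value f* from the closed-form least-squares solution.
f_star = loss(np.linalg.lstsq(X, y, rcond=None)[0])

# The suboptimality gap f(w_t) - f* shrinks geometrically.
print(losses[-1] - f_star)
```

In the NTK regime discussed in the tutorial, a similar argument applies: near-constancy of the tangent kernel during training yields a PL-type condition on the (nonconvex) network loss, which is what makes this linear-convergence analysis transferable to neural networks.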

Syllabus

Tutorial: Statistical Learning Theory and Neural Networks I

Taught by

Simons Institute
