
Neural Networks Meet Nonparametric Regression: Generalization by Weight Decay and Large Learning Rates

Simons Institute via YouTube

Overview

Explore the intersection of neural networks and nonparametric regression in this lecture by Yu-Xiang Wang of UC San Diego. Delve into why overparameterized deep learning models outperform classical methods such as kernels and splines. Examine how standard hyperparameter tuning in deep neural networks (DNNs) implicitly uncovers hidden sparsity and low-dimensional structure, leading to improved adaptivity. Gain new perspectives on overparameterization, representation learning, and the generalization capabilities of neural networks through optimization-algorithm-induced implicit biases such as Edge-of-Stability and Minima Stability. Analyze theory and examples illustrating how DNNs achieve adaptive and near-optimal generalization, shedding light on their effectiveness in practical applications.
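The lecture's title names two training hyperparameters, weight decay and large learning rates. As a rough, illustrative sketch (not taken from the talk), the Python snippet below fits an overparameterized two-layer ReLU network to a toy 1-D regression problem with both knobs set explicitly via PyTorch's SGD optimizer; the data, architecture, and hyperparameter values are assumptions chosen for illustration.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Synthetic 1-D regression data: noisy samples of a piecewise-smooth target.
# (Purely illustrative; the lecture's actual examples are not reproduced here.)
x = torch.linspace(-1.0, 1.0, 200).unsqueeze(1)
y = torch.sign(x) * torch.sqrt(torch.abs(x)) + 0.05 * torch.randn_like(x)

# Overparameterized two-layer ReLU network: far more hidden units than the
# 200 data points strictly require.
model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))

# The two knobs named in the title: weight decay (an L2 penalty on the weights)
# and a learning rate chosen large relative to typical full-batch defaults.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)
loss_fn = nn.MSELoss()

# Plain full-batch gradient descent.
for step in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")

Pushing the learning rate higher moves training toward the Edge-of-Stability regime discussed in the talk, where the loss no longer decreases monotonically yet the resulting minima can still generalize well.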

Syllabus

Neural Networks Meet Nonparametric Regression: Generalization by Weight Decay and Large Learning Rates

Taught by

Simons Institute

