The Challenges of Training Infinitely Large Neural Networks

Paul G. Allen School via YouTube

Overview

Explore the challenges of training infinitely large neural networks in this distinguished seminar featuring Mikhail Belkin of UCSD. Delve into the recent successes of deep learning and the trend toward ever-larger neural networks for improved performance, and discover why it may be beneficial to train infinitely large networks directly. Learn how the Neural Tangent Kernel shows that an infinitely wide neural network is equivalent to a kernel machine. Examine the two primary obstacles to training such networks: kernel machines in this regime do not learn features, and they are computationally difficult to scale to large datasets. Gain insights into Recursive Feature Machines (RFMs), which incorporate feature learning without backpropagation and outperform Multilayer Perceptrons; a sketch of the idea follows below. Understand the potential of RFMs to achieve state-of-the-art performance on tabular data and their efficiency on small to medium datasets. Learn about recent efforts to scale kernel machines to larger datasets and the prospects of reaching the data sizes used to train modern Large Language Models (LLMs).
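
For readers who want a concrete picture of the RFM idea mentioned above, here is a minimal Python sketch, assuming a Laplace kernel with a learned Mahalanobis metric and kernel ridge regression. The function names (laplace_kernel, rfm_fit) and parameters (n_iters, reg, bandwidth) are illustrative choices, not taken from the talk.

    # Minimal sketch of a Recursive Feature Machine (RFM):
    # alternate between fitting a kernel machine and updating a feature
    # matrix M from the average gradient outer product (AGOP) of the fit.
    import numpy as np

    def laplace_kernel(X, Z, M, bandwidth=1.0):
        # K(x, z) = exp(-||x - z||_M / bandwidth), with ||v||_M^2 = v^T M v.
        XM = X @ M
        sq = (np.sum(XM * X, axis=1)[:, None]
              + np.sum((Z @ M) * Z, axis=1)[None, :]
              - 2 * XM @ Z.T)
        return np.exp(-np.sqrt(np.maximum(sq, 0.0)) / bandwidth)

    def rfm_fit(X, y, n_iters=5, reg=1e-3, bandwidth=1.0):
        n, d = X.shape
        M = np.eye(d)  # start from the plain Laplace kernel
        for _ in range(n_iters):
            K = laplace_kernel(X, X, M, bandwidth)
            alpha = np.linalg.solve(K + reg * np.eye(n), y)  # kernel ridge
            # AGOP of the predictor f(x) = sum_i alpha_i K(x, x_i);
            # its input gradients drive the feature learning step.
            G = np.zeros((d, d))
            for j in range(n):
                diff = X[j] - X                                   # (n, d)
                dist = np.sqrt(np.maximum(
                    np.sum((diff @ M) * diff, axis=1), 1e-12))
                w = -K[j] / (bandwidth * dist)  # derivative weights
                grad = M @ (diff * (w * alpha)[:, None]).sum(axis=0)
                G += np.outer(grad, grad)
            M = G / n
            M /= np.trace(M) / d  # keep the scale of M stable
        # Refit the kernel machine with the final feature matrix.
        K = laplace_kernel(X, X, M, bandwidth)
        alpha = np.linalg.solve(K + reg * np.eye(n), y)
        return M, alpha

    # Usage: recover a single relevant direction in 10-d inputs.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = np.sin(X[:, 0])            # target depends only on coordinate 0
    M, alpha = rfm_fit(X, y)
    print(np.round(np.diag(M), 2))  # weight should concentrate on coord 0

No backpropagation is involved: feature learning happens entirely through the AGOP update of M, which is the contrast with Multilayer Perceptrons drawn in the overview.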

Syllabus

Distinguished Seminar in Optimization and Data: Mikhail Belkin (UCSD)

Taught by

Paul G. Allen School
