Infinite Limits and Scaling Laws for Deep Neural Networks

Harvard CMSA via YouTube

Overview

Explore a mathematics seminar presentation that delves into scaling laws and infinite-parameter limits of deep neural networks. Learn how increasing model size and training horizons have driven advances in computer vision and natural language processing, with a focus on understanding how finite-parameter models improve as they grow larger. Examine which infinite-parameter limits preserve representation learning, and see how dynamical mean-field theory methods characterize the rates at which finite models converge to these limits. Investigate the practical implications by comparing the training dynamics of finite networks to their idealized limits. Master a theoretical framework that explains how generalization depends on training time, model size, and data quantity, and understand compute-optimal scaling strategies. Gain insight into how representation learning can improve neural scaling laws, potentially doubling the training-time exponent compared to static kernel limits on complex tasks.
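
The framework described above is easiest to picture with a toy scaling-law ansatz. The sketch below is illustrative only and not taken from the seminar; the symbols L_inf, a, b, alpha_t, and alpha_N are assumed placeholders for the constants such a framework would predict.

```latex
% A minimal sketch (not from the seminar) of a joint power-law ansatz for the
% test loss, assuming model size N and training time t contribute additive
% error terms; L_inf, a, b, alpha_t, alpha_N are placeholder constants.
\mathcal{L}(t, N) \;\approx\; \mathcal{L}_{\infty} + a\, t^{-\alpha_t} + b\, N^{-\alpha_N}
% Under a fixed compute budget C \propto N t, minimizing the loss over N gives
% N^{*} \propto C^{\alpha_t / (\alpha_t + \alpha_N)}, so the two exponents set
% the compute-optimal split between model size and training time. A larger time
% exponent (e.g. the roughly doubled alpha_t attributed to representation
% learning versus a static kernel limit) therefore shifts that allocation.
```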

Syllabus

Blake Bordelon | Infinite Limits and Scaling Laws for Deep Neural Networks

Taught by

Harvard CMSA
