One of Three Theoretical Puzzles - Generalization in Deep Networks


MIT CBMM via YouTube


Classroom Contents


  1. Intro
  2. Deep networks can avoid the curse of dimensionality for compositional functions
  3. Minimize classification error → minimize a surrogate function
  4. Motivation: generalization bounds for regression
  5. GD: unconstrained optimization as a gradient dynamical system
  6. Example: Lagrange multiplier
  7. Explicit norm constraint gives weight normalization
  8. Overparametrized networks fit the data and generalize
  9. Gradient descent for deep ReLU networks
