Can Learning Theory Resist Deep Learning? - Francis Bach, INRIA

Alan Turing Institute via YouTube

Classroom Contents

  1. Intro
  2. Scientific context
  3. Parametric supervised machine learning
  4. Convex optimization problems
  5. Exponentially convergent SGD for smooth finite sums
  6. Exponentially convergent SGD for finite sums
  7. Convex optimization for machine learning
  8. Theoretical analysis of deep learning
  9. Optimization for multi-layer neural networks
  10. Gradient descent for a single hidden layer
  11. Optimization on measures
  12. Many particle limit and global convergence (Chizat and Bach, 2018a)
  13. Simple simulations with neural networks (see the sketch after this list)
  14. From qualitative to quantitative results?
  15. Lazy training (Chizat and Bach, 2018)
  16. From lazy training to neural tangent kernel
  17. Are state-of-the-art neural networks in the lazy regime?
  18. Is the neural tangent kernel useful in practice?
  19. Can learning theory resist deep learning?
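
Items 10 and 13 above concern gradient descent for a single hidden layer and simple simulations with neural networks. As a rough illustration of that setting, here is a minimal NumPy sketch, not taken from the talk: it trains a one-hidden-layer ReLU network on a toy 1-D regression problem with full-batch gradient descent. The data, network size, initialization scale, and learning rate are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the talk): full-batch gradient descent
# on a one-hidden-layer ReLU network for toy 1-D regression.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = sin(2*pi*x) + noise, with x in [-1, 1]
n = 200
x = rng.uniform(-1.0, 1.0, size=(n, 1))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal((n, 1))

# Single hidden layer with m ReLU units: f(x) = sum_j a_j * relu(w_j * x + b_j)
m = 100
w = rng.standard_normal((1, m))
b = rng.standard_normal((1, m))
a = rng.standard_normal((m, 1)) / np.sqrt(m)  # output-weight scaling (see note below)

lr = 0.02
for step in range(10001):
    pre = x @ w + b                  # (n, m) pre-activations
    h = np.maximum(pre, 0.0)         # ReLU features
    pred = h @ a                     # (n, 1) network output
    err = pred - y
    loss = 0.5 * np.mean(err ** 2)

    # Gradients of the mean squared loss with respect to all parameters
    grad_a = h.T @ err / n                       # (m, 1)
    dh = (err @ a.T) * (pre > 0)                 # (n, m) backprop through the ReLU
    grad_w = x.T @ dh / n                        # (1, m)
    grad_b = dh.mean(axis=0, keepdims=True)      # (1, m)

    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

    if step % 2000 == 0:
        print(f"step {step:5d}  loss {loss:.4f}")
```

As a heuristic pointer only: the scale of the output weights at initialization (here 1/sqrt(m); a 1/m scale is the mean-field choice) is, roughly, what separates the lazy/neural-tangent-kernel regime from the feature-learning regime touched on in items 12 and 15-16; this sketch makes no claim about the talk's exact setup.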
