Fit Without Fear - An Over-Fitting Perspective on Modern Deep and Shallow Learning

MITCBMM via YouTube

Interpolated classifiers work (9 of 31)

Classroom Contents

  1. Intro
  2. Supervised ML
  3. Interpolation and Overfitting
  4. Modern ML
  5. Fit without Fear
  6. Overfitting perspective
  7. Kernel machines
  8. Interpolation in kernels
  9. Interpolated classifiers work
  10. What is going on?
  11. Performance of kernels
  12. Kernel methods for big data
  13. The limits of smooth kernels
  14. EigenPro: practical implementation
  15. Comparison with state-of-the-art
  16. Improving speech intelligibility
  17. Stochastic Gradient Descent
  18. The Power of Interpolation
  19. Optimality of mini-batch size 1
  20. Minibatch size?
  21. Real data example
  22. Learning kernels for parallel computation?
  23. Theory vs practice
  24. Model complexity of interpolation?
  25. How to test model complexity?
  26. Testing model complexity for kernels
  27. Levels of noise
  28. Theoretical analyses fall short
  29. Simplicial interpolation A-fit
  30. Nearly Bayes optimal
  31. Parting Thoughts I
