Statistical Learning Theory for Modern Machine Learning - John Shawe-Taylor

Institute for Advanced Study via YouTube

Contents

  1. Intro
  2. Learning is to be able to generalise
  3. Statistical Learning Theory is about high confidence
  4. Error distribution picture
  5. Mathematical formalization
  6. What to achieve from the sample?
  7. Risk (aka error) measures
  8. Before PAC Bayes
  9. The PAC-Bayes framework
  10. PAC Bayes aka Generalised Bayes
  11. PAC Bayes bounds vs. Bayesian learning
  12. A General PAC Bayesian Theorem (see the sketch after this list)
  13. Proof of the general theorem
  14. Linear classifiers
  15. Form of the SVM bound
  16. Slack variable conversion
  17. Observations
  18. Deep Network Training Experiments
  19. Training and Generalisation Results
  20. A flexible framework
  21. Conclusions
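
For background on item 12, here is a minimal sketch of one standard member of the PAC-Bayes family, the PAC-Bayes-kl bound of Seeger and Maurer; the general theorem presented in the lecture may take a different, more general form, so this is orientation only and not a transcript of the slide. Fix a prior $\pi$ over hypotheses before seeing the data and a confidence level $\delta \in (0, 1]$. Then, with probability at least $1 - \delta$ over the draw of an i.i.d. sample $S$ of size $m$, simultaneously for all posterior distributions $\rho$,

\[
\mathrm{kl}\bigl( R_S(G_\rho) \,\|\, R(G_\rho) \bigr) \;\le\; \frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln \frac{2\sqrt{m}}{\delta}}{m},
\]

where $R_S(G_\rho) = \mathbb{E}_{h \sim \rho}[R_S(h)]$ is the expected empirical risk of the Gibbs classifier, $R(G_\rho) = \mathbb{E}_{h \sim \rho}[R(h)]$ is its expected true risk, $\mathrm{KL}(\rho \,\|\, \pi)$ is the Kullback-Leibler divergence between posterior and prior, and $\mathrm{kl}(q \,\|\, p) = q \ln \tfrac{q}{p} + (1 - q) \ln \tfrac{1 - q}{1 - p}$ is the binary KL divergence. The bound holds uniformly over $\rho$, so the posterior may be chosen after seeing the data, while the prior must be fixed in advance.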
