The Mother of All Representer Theorems for Inverse Problems and Machine Learning - Michael Unser
Classroom Contents
- 1 Intro
- 2 Variational formulation of inverse problem
- 3 Learning as a (linear) inverse problem
- 4 RKHS representer theorem for machine learning
- 5 Is there a mother of all representer theorems?
- 6 General notion of Banach space
- 7 Dual of a Banach space
- 8 Riesz conjugate for Hilbert spaces
- 9 Generalization: Duality mapping
- 10 Properties of duality mapping
- 11 Mother of all representer theorems (Cont'd)
- 12 Kernel methods for machine learning
- 13 Tikhonov regularization (see white board)
- 14 Qualitative effect of Banach conjugation
- 15 Sparsity promoting regularization
- 16 Extreme points
- 17 Geometry of ℓ2 vs. ℓ1 minimization
- 18 Isometry with space of Radon measures
- 19 Sparse kernel expansions (Cont'd)
- 20 Special case: Translation-invariant kernels
- 21 RKHS vs. Sparse kernel expansions (LSI)
- 22 Conclusion (Cont'd)
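For orientation, a minimal sketch of the classical RKHS representer theorem named in chapter 4; the notation (kernel k, RKHS \mathcal{H}(k), training pairs (x_m, y_m), loss E, regularization weight \lambda > 0) is my own shorthand, not necessarily the speaker's:

\min_{f \in \mathcal{H}(k)} \sum_{m=1}^{M} E\bigl(y_m, f(x_m)\bigr) + \lambda \|f\|_{\mathcal{H}(k)}^{2}
\quad \Longrightarrow \quad
f^{*}(x) = \sum_{m=1}^{M} a_m \, k(x, x_m)

The infinite-dimensional search thus reduces to finding the M coefficients a_m; for the squared loss this means solving the linear system (K + \lambda \mathrm{I})\, a = y with Gram matrix K = [k(x_m, x_{m'})]. As the chapter titles suggest, the talk's "mother" theorem replaces the Hilbert-space norm with a Banach-space regularizer, and a sparsity-promoting (\ell_1-type) choice then yields sparse kernel expansions with fewer active terms (chapters 15 to 21).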