The Mother of All Representer Theorems for Inverse Problems and Machine Learning - Michael Unser

Alan Turing Institute via YouTube

Overview

Explore the foundations of inverse problems and machine learning in this 48-minute conference talk from the Alan Turing Institute. Delve into the "mother of all representer theorems" as presented by Michael Unser. Begin with an introduction to the variational formulation of inverse problems and the concept of learning as a linear inverse problem. Examine the Reproducing Kernel Hilbert Space (RKHS) representer theorem for machine learning before exploring the possibility of a unifying representer theorem. Investigate Banach spaces, their duals, and the generalization of duality mapping. Learn about kernel methods in machine learning, Tikhonov regularization, and the qualitative effects of Banach conjugation. Analyze sparsity-promoting regularization, extreme points, and the geometry of l2 vs. l1 minimization. Discover the isometry with the space of Radon measures and explore sparse kernel expansions, including the special case of translation-invariant kernels. Compare RKHS with sparse kernel expansions in the context of linear shift-invariant (LSI) systems. Gain valuable insights into the mathematical foundations underlying modern data science and machine learning techniques.
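To make the central idea of the talk concrete: the RKHS representer theorem guarantees that the minimizer of a Tikhonov-regularized empirical risk over an RKHS is a finite kernel expansion, f(x) = Σᵢ aᵢ k(x, xᵢ). The sketch below (not from the talk itself; kernel choice and function names are illustrative) shows this for squared loss, where the coefficients solve the linear system (K + λI)a = y.

```python
import numpy as np

# Illustrative sketch of the RKHS representer theorem for kernel ridge
# regression: the optimal f lives in the span of the kernel functions
# centered at the training points, so fitting reduces to solving for
# the expansion coefficients a.

def gaussian_kernel(X, Z, gamma=1.0):
    """Gram matrix of the Gaussian kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_ridge_fit(X, y, lam=1e-3, gamma=1.0):
    """Coefficients of the representer expansion: solve (K + lam*I) a = y."""
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, a, X_new, gamma=1.0):
    """Evaluate f(x) = sum_i a_i k(x, x_i) at new points."""
    return gaussian_kernel(X_new, X_train, gamma) @ a

# Fit a noisy sine on 30 points; with light regularization the finite
# kernel expansion tracks the data closely.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.standard_normal(30)
a = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, a, X)
```

The talk's "mother of all representer theorems" generalizes exactly this picture from Hilbert spaces to Banach spaces, where sparsity-promoting (l1-type) regularizers replace the quadratic norm.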

Syllabus

Intro
Variational formulation of inverse problem
Learning as a (linear) inverse problem
RKHS representer theorem for machine learning
Is there a mother of all representer theorems?
General notion of Banach space
Dual of a Banach space
Riesz conjugate for Hilbert spaces
Generalization: Duality mapping
Properties of duality mapping
Mother of all representer theorems (Cont'd)
Kernel methods for machine learning
Tikhonov regularization (see white board)
Qualitative effect of Banach conjugation
Sparsity promoting regularization
Extreme points
Geometry of l2 vs. l1 minimization
Isometry with space of Radon measures
Sparse kernel expansions (Cont'd)
Special case: Translation-invariant kernels
RKHS vs. Sparse kernel expansions (LSI)
Conclusion (Cont'd)

Taught by

Alan Turing Institute
