Overview
Explore optimal learning from data in this 53-minute lecture by Ronald DeVore at the Hausdorff Center for Mathematics. Delve into the challenge of constructing approximations to an unknown function from functional observations of it. Examine quantitative theory, error measurement, and model class information. Discover methods for determining the smallest possible recovery error and for constructing near-optimal approximations via discrete optimization problems. Investigate variants involving noisy data and explore joint research findings. Cover topics including the learning problem, mathematical questions, quantified performance, point evaluations, stochastic settings, deep learning, and bridging gaps in numerical approaches.
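As an illustration of the learning problem described above (not the construction presented in the lecture), the sketch below recovers an unknown function from a handful of point evaluations by solving a discrete least-squares problem over a simple model class of polynomials, then estimates the recovery error in the sup-norm. All function names, the choice of model class, and the parameters are assumptions made for this example.

```python
# Minimal sketch of recovering an unknown function f from m point evaluations,
# assuming a polynomial model class and a least-squares fit; illustrative only.
import numpy as np

def recover_from_samples(x_obs, y_obs, degree=5):
    """Fit a polynomial approximation to the observed data (x_i, f(x_i))."""
    coeffs = np.polyfit(x_obs, y_obs, degree)   # discrete least-squares problem
    return np.poly1d(coeffs)                    # callable approximation f_hat

def recovery_error(f, f_hat, grid):
    """Estimate the uniform (sup-norm) error of the recovery on a fine grid."""
    return np.max(np.abs(f(grid) - f_hat(grid)))

if __name__ == "__main__":
    f = np.cos                                  # the "unknown" target function
    x_obs = np.linspace(0.0, np.pi, 12)         # data sites (point evaluations)
    y_obs = f(x_obs)                            # noiseless observations of f
    f_hat = recover_from_samples(x_obs, y_obs)
    grid = np.linspace(0.0, np.pi, 1000)
    print(f"sup-norm recovery error: {recovery_error(f, f_hat, grid):.2e}")
```

In the lecture's setting, the model class encodes the prior information about f, and the quantitative theory asks how small the worst-case error over that class can be made given the available data; the least-squares fit here merely stands in for a generic recovery procedure.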
Syllabus
Intro
The Learning Problem
Mathematical Questions
Optimal Learning
The Numerical Challenge
Takeaways from this Theorem
Quantified Performance
Point Evaluations
Similar Theory
Noisy Measurements
Solving the optimization problem
Data Sites: Do We Have Enough Data?
The Stochastic Setting
Deep Learning (DL)
Other Numerical Objections
No Model Class or Penalty
Bridging the Gap
Summary
Taught by
Hausdorff Center for Mathematics