The Quest for Adaptivity in Machine Learning - Comparing Popular Methods

Institut des Hautes Etudes Scientifiques (IHES) via YouTube

Overview

Explore the strengths and weaknesses of popular supervised learning algorithms in this 32-minute lecture by Francis Bach from INRIA, presented at the Institut des Hautes Etudes Scientifiques (IHES). Delve into the concept of "no free lunch theorems" and understand why there is no universal algorithm that performs well on all learning problems. Compare the performance of k-nearest-neighbor, kernel methods, and neural networks, examining their adaptivity, regularization, and optimization techniques. Investigate the curse of dimensionality, smoothness of prediction functions, and the role of latent variables in machine learning. Gain insights into the simplicity bias and overfitting issues associated with neural networks. Conclude with a comprehensive understanding of the trade-offs and considerations in choosing appropriate learning methods for different problem domains.
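To make the comparison concrete, here is a minimal sketch (not from the lecture itself) contrasting two of the methods discussed: 1-nearest-neighbor local averaging and kernel ridge regression with a Gaussian kernel, fit to a smooth one-dimensional target. The bandwidth and regularization strength below are arbitrary demo choices, and the synthetic data is invented for illustration.

```python
import numpy as np

# Illustrative sketch: how a local-averaging method (1-NN) and a kernel
# method (kernel ridge regression) each estimate a smooth prediction
# function from noisy samples.

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)

X_test = np.linspace(-1, 1, 100)[:, None]
y_true = np.sin(3 * X_test[:, 0])

# 1-nearest-neighbor: predict the label of the closest training point.
dist = np.abs(X_test - X.T)            # (100, n) pairwise distances
knn_pred = y[dist.argmin(axis=1)]

# Kernel ridge regression with a Gaussian kernel; bandwidth sigma and
# regularization lam are hypothetical values chosen for this demo.
def gauss_kernel(A, B, sigma=0.3):
    return np.exp(-np.abs(A - B.T) ** 2 / (2 * sigma**2))

lam = 1e-2
K = gauss_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)   # regularized ERM in closed form
krr_pred = gauss_kernel(X_test, X) @ alpha

print("1-NN test MSE:", np.mean((knn_pred - y_true) ** 2))
print("KRR  test MSE:", np.mean((krr_pred - y_true) ** 2))
```

On a smooth target like this, the kernel method typically benefits from its built-in smoothness assumption, while 1-NN carries the noise of individual labels into its predictions; neither choice is universally better, which is the "no free lunch" point of the talk.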

Syllabus

Intro
Supervised machine learning: classical formalization
Local averaging
Curse of dimensionality on X = R^d
Support of inputs
Smoothness of the prediction function
Latent variables
Need for adaptivity
From kernels to neural networks
Regularized empirical risk minimization
Adaptivity of kernel methods
Adaptivity of neural networks
Comparison of kernel and neural network regimes
Optimization for neural networks
Simplicity bias
Overfitting with neural networks
Conclusion

Taught by

Institut des Hautes Etudes Scientifiques (IHES)
